Combinational Logic Circuits and Theory of Computation
Problem
I'm trying to connect combinational logic circuits (circuits built only from logic gates) with everything I have learned recently in the theory of computation.
I was wondering whether combinational logic circuits can implement computations the same way finite state machines can. They seem radically different:
Finite state machines have a well-defined memory in the form of the states they can be in. Combinational logic circuits don't have a well-defined memory, so to implement algorithms that need memory they resort to serial connections (see how $C_{out}$ of the previous adder feeds $C_{in}$ of the current adder in the image below).
However radically different they might seem, both seem to be doing computations. For instance, both can implement an algorithm for binary addition (and even binary multiplication), however different those implementations might be:
FSM :
Combinational Logic Circuit ( C, as in $C_{in}$ and $C_{out}$, stands for Carry ) :
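The carry chain described above can be sketched in code. This is a hypothetical illustration (not from the original post): each full adder is built only from AND/OR/XOR gates, and the carry out of one stage is wired to the carry in of the next.

```python
# Sketch of a ripple-carry adder built only from logic gates,
# illustrating how C_out of one full adder feeds C_in of the next.

def full_adder(a, b, c_in):
    """One-bit full adder expressed with AND/OR/XOR gates."""
    s = a ^ b ^ c_in                    # sum bit
    c_out = (a & b) | (c_in & (a ^ b))  # carry out
    return s, c_out

def ripple_carry_add(x_bits, y_bits):
    """Add two equal-length bit lists (least-significant bit first)
    by chaining full adders: carry out of stage i is carry in of i+1."""
    carry = 0
    out = []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]  # the final carry becomes the top bit

# 3 + 5 = 8, with bits written least-significant first:
print(ripple_carry_add([1, 1, 0], [1, 0, 1]))  # [0, 0, 0, 1]
```

Note that the "memory" here is not memory in the FSM sense: the carry wire is just an intermediate signal inside a single fixed-size circuit, which is exactly the tension the question is about.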
I'm even thinking (although still very uncertain) that we can convert every FSM into a corresponding combinational logic circuit.
So I'm asking myself:
Can combinational logic circuits also be considered an instantaneous kind of model of computation? Can we apply all the concepts we learn in computability theory and computational complexity theory, like space complexity and computability, to them?
On one hand, they don't seem to fit as a model of computation because they have no elementary operations (like reading/writing a tape, function reduction, or steps in the proof search of the logic programming paradigm); they implement their computations instantaneously.
But on the other hand, they do seem to fit as a model of computation because we can model all kinds of computation with them (binary addition is one example), and they can be viewed abstractly (by focusing only on the truth tables and the logic gates).
Solution
Logic circuits are common in complexity theory, where they go by the name circuits. There is a big difference between circuits and models of computation such as the Turing machine: each circuit can only handle inputs of fixed size. In order to fix this, under the circuit computation model, for every input length $n$ there is a circuit $C_n$, and together they compute a function on strings of arbitrary length. This computation model, as stated, is too strong: it can compute uncomputable functions, indeed all functions. The problem is that an infinite sequence of circuits doesn't necessarily have a finite description. In order to fix this problem, we usually demand that the circuits be uniform, that is, that they be generated by some Turing machine, which on input $n$ generates $C_n$.
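The family-of-circuits idea above can be made concrete with a small sketch (mine, not the answerer's; parity is just an illustrative choice of function). The generator function plays the role of the Turing machine that, on input $n$, outputs $C_n$; each $C_n$ is a fixed-size circuit accepting exactly $n$ bits, and the family $\{C_1, C_2, \ldots\}$ together computes parity on strings of arbitrary length.

```python
# Hedged sketch of a uniform circuit family for the parity function.
# generate_parity_circuit plays the role of the Turing machine that,
# given n, outputs the circuit C_n.

def generate_parity_circuit(n):
    """Return C_n: a circuit computing the parity of exactly n input bits,
    structured as a chain of XOR gates."""
    def circuit(bits):
        if len(bits) != n:
            raise ValueError("a circuit only accepts inputs of its fixed size")
        acc = 0
        for b in bits:  # chain of XOR gates
            acc ^= b
        return acc
    return circuit

c3 = generate_parity_circuit(3)
print(c3([1, 0, 1]))  # 0 (even number of 1s)
```

The uniformity requirement is precisely that this generator is computable; without it, nothing stops an adversarial choice of one arbitrary circuit per input length, which is how non-uniform families can "compute" uncomputable functions.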
The circuit model is especially popular in complexity theory, even without the uniformity restriction. The reason is the following observation: a Turing machine running in polynomial time can be converted to a (uniform) sequence of circuits of polynomial size, using essentially the ideas of Cook's theorem (which shows that SAT is NP-complete). Hence if you want to prove that a certain problem cannot be solved in polynomial time, it is enough to show that it cannot be solved by polynomial size circuits.
Context
StackExchange Computer Science Q#37867, answer score: 12