HiveBrain v1.2.0

Do Turing machines assume something infinite at some point?

Submitted by: @import:stackexchange-cs

Problem

In a previous question, What exactly is an algorithm?, I asked whether an "algorithm" that returns the value of a function by looking it up in an array of precomputed values really counts as an algorithm.

One of the answers that caught my attention was this one:


The factorial example gets into a different model of computation,
called non-uniform computation. A Turing Machine is an example of a
uniform model of computation: It has a single, finite description, and
works for inputs of arbitrarily large size. In other words, there
exists a TM that solves the problem for all input sizes.


Now, we could instead consider computation as follows: For each input
size, there exists a TM (or some other computational device) that
solves the problem. This is a very different question. Notice that a
single TM cannot store the factorial of every single integer, since
the TM has a finite description. However, we can make a TM (or a
program in C) that stores the factorials of all numbers below 1000.
Then, we can make a program that stores the factorials of all numbers
between 1000 and 10000. And so on.

Doesn't every TM actually assume some way to deal with infinity? I mean, even a TM with a finite description that computes the factorial of any number N through the algorithm

int fact(int n)
{
    int r = 1;
    for (int i = 2; i <= n; i++)
        r = r * i;
    return r;
}


contains the assumption that a TM has the "hardware" to compare numbers of arbitrary size through the "<=" comparator, adders to increment i up to an arbitrary value and, moreover, the capability of representing numbers of arbitrary size.

Am I missing something? Why is the approach that I presented in my other question less feasible with respect to infinity than this one?

Solution

A Turing machine does not have the ability "to compare numbers of arbitrary size through the <= comparator" because a Turing machine does not have a "<= comparator". A Turing machine has a fixed, finite set $Q$ of states and a fixed, finite tape alphabet $\Sigma$. At each step of the computation, the Turing machine looks at its current state and the symbol under its read/write head and decides what to do next: which state to enter, which symbol to write to the tape and which way to move the tape head.
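To make the "look at the state and the symbol, then act" loop concrete, here is a minimal sketch in C. The machine, the tape encoding, and the names (Action, run_flip_tm) are illustrative assumptions, not part of the answer: a single-state machine whose transition function complements a binary string, halting on the blank that ends the tape.

```c
#include <string.h>

/* Illustrative sketch: one TM computation step is nothing more than
   a transition-table lookup (state, symbol) -> (new state, symbol to
   write, head move). All names here are hypothetical. */

enum { MOVE_LEFT = -1, MOVE_RIGHT = 1 };

typedef struct {
    int next_state;   /* which state to enter */
    char write;       /* which symbol to write under the head */
    int move;         /* which way to move the head */
} Action;

/* Run a hard-coded single-state TM that replaces 0 with 1 and 1 with
   0, halting at the '\0' that plays the role of the blank symbol. */
void run_flip_tm(char *tape)
{
    int state = 0;        /* the only working state */
    size_t head = 0;
    while (tape[head] != '\0') {
        char sym = tape[head];
        /* the transition function delta(state, sym) */
        Action a;
        a.next_state = 0;
        a.write = (sym == '0') ? '1' : '0';
        a.move = MOVE_RIGHT;

        tape[head] = a.write;  /* write */
        state = a.next_state;  /* change state */
        head += a.move;        /* move the head */
    }
    (void)state;
}
```

Note that each iteration of the loop touches exactly one tape cell and consults only the finitely many (state, symbol) pairs, which is all the "hardware" a Turing machine has.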

Because of this, a Turing machine cannot compare arbitrarily large numbers in a single <= instruction. Using the state, it can remember at most $|Q|$ different numbers and, using the alphabet, it can write at most $|\Sigma|$ different numbers in a single tape cell (using each possible symbol to represent one number). As such, to compare arbitrarily large numbers on a Turing machine, you must write each number as a sequence of digits on the tape and write an algorithm that will take multiple steps to compare those two numbers. As you can imagine, this makes writing Turing machine programs a rather fiddly endeavour.
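As a sketch of what that multi-step comparison looks like, here is one way to compare two numbers held as decimal digit strings, the way a TM would hold them on its tape. This assumes both numerals have no leading zeros; the function name leq_digits is hypothetical.

```c
#include <string.h>

/* Sketch: "<=" decomposed into many small steps over digit strings
   with no leading zeros. Returns 1 if a <= b, else 0. */
int leq_digits(const char *a, const char *b)
{
    size_t la = strlen(a), lb = strlen(b);
    /* Pass 1: with no leading zeros, a longer numeral denotes a
       larger number, so a length comparison settles it. */
    if (la != lb)
        return la < lb;
    /* Pass 2: equal length -- scan from the most significant digit,
       one cell at a time, just as a TM head would. */
    for (size_t i = 0; i < la; i++) {
        if (a[i] != b[i])
            return a[i] < b[i];
    }
    return 1; /* all digits equal */
}
```

An actual Turing machine would do the length comparison by shuttling back and forth between the two numerals, marking off one digit of each per round trip, so even "compare the lengths" costs many steps.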

Turing machines don't really "deal with infinity": they deal with unbounded finite things, at least in their standard definition. The input is a finite string and, after any finite number of steps, the machine has only examined or written to a finite number of tape cells. There is no bound on the size of the input or on the number of computation steps, but the input is always finite and, after any finite number of steps, only a finite amount of output has been produced.

Context

StackExchange Computer Science Q#32387, answer score: 9
