HiveBrain v1.2.0
Get Started
← Back to all entries
gotchaMinor

Why does it take $O(f(n)^2)$ to simulate a 3-tape $O(f(n))$-time TM on a 1-tape TM?

Submitted by: @import:stackexchange-cs

Problem

This looks like a fundamental result but I can't find a resource online that gives an intuitive interpretation of this complexity. Any basic explanation is appreciated.

Solution

Have a look at the book by Hopcroft and Ullman, "Introduction to Automata Theory, Languages, and Computation" (ISBN 0-201-02988-X). I only have the German edition of this book, but there the result appears as Theorem 12.5.

However, the basic idea behind the tape reduction from $k$ tapes to 1 is quite simple. Assume the running time is bounded by $f(n)$. You enlarge your working alphabet $\Gamma$ so that every super-character represents $k$ original characters and $k$ flags. The flags mark the position of the head on the corresponding simulated tape. To simulate one step of the old machine, we have to scan through the whole tape. However, the tape contains at most $O(f(n))$ characters (the machine can visit at most one new cell per step), hence one scan needs $O(f(n))$ time. Since we simulate $O(f(n))$ steps, the 1-tape machine runs in $O(f(n)^2)$ time. I think from here you can work out the details on your own.
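To make the encoding concrete, here is a minimal sketch (the function names and tape contents are made up for illustration) of packing $k$ tapes into one tape of super-characters, where each cell stores one symbol per tape plus a head flag per tape. Note that reading the symbols under all $k$ heads already costs a scan over the whole used portion of the tape, which is the source of the $O(f(n))$ cost per simulated step.

```python
# Hypothetical sketch: pack k tapes into one tape of "super-characters".
# Each cell is a tuple of (symbol, head_is_here) pairs, one pair per tape.

def make_single_tape(tapes, heads):
    """tapes: list of k symbol lists; heads: list of k head positions.
    Shorter tapes are padded with the blank symbol '_'."""
    length = max(len(t) for t in tapes)
    single = []
    for i in range(length):
        cell = tuple(
            (tapes[j][i] if i < len(tapes[j]) else "_", heads[j] == i)
            for j in range(len(tapes))
        )
        single.append(cell)
    return single

def read_heads(single_tape):
    """One part of simulating a single step: scan the whole tape to
    collect the symbol under each of the k heads. This full scan is
    what costs O(f(n)) time per simulated step."""
    k = len(single_tape[0])
    under_heads = [None] * k
    cells_scanned = 0
    for cell in single_tape:
        cells_scanned += 1
        for j, (sym, flag) in enumerate(cell):
            if flag:
                under_heads[j] = sym
    return under_heads, cells_scanned

# Example: 3 tapes with heads at positions 2, 0, 1.
single = make_single_tape([list("abba"), list("01"), list("xyz")], [2, 0, 1])
syms, scanned = read_heads(single)
print(syms)     # ['b', '0', 'y']
print(scanned)  # 4 -- every used cell is visited for one simulated step
```

Repeating such a scan for each of the $O(f(n))$ simulated steps gives the total bound of $O(f(n)) \cdot O(f(n)) = O(f(n)^2)$.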

Context

StackExchange Computer Science Q#4989, answer score: 6

Revisions (0)

No revisions yet.