Proof sketch of Blum's Speedup Theorem
Problem
In his Quantum Computing Since Democritus, Scott Aaronson outlined a proof sketch of Blum's Speedup Theorem which roughly looks like the following.
Given an enumeration of Turing machines $\{M_i\}_{i \in \mathbb{N}}$, let $S_i = \{M_1, \dots, M_i\}$. We can construct a computable function $f(n)$ such that if $f(n)$ can be computed in $O(2^n)$ steps, then it can also be computed in $O(2^{n-1})$ steps, in the following manner (Python pseudocode):
canceled = set()
for i in range(n):
    for Mj in Si:
        if Mj not in canceled and Mj halts in 2 ** (n - i) steps:
            f(i) = 1 - output of Mj  # diagonalize against Mj
            canceled.add(Mj)
            break
    else:
        # no machine in Si was canceled at stage i
        f(i) = 0
return f(n)

Aaronson claims we can "hardwire" the canceled Turing machines from iterations 1 to i and skip to iteration i+1. The complexity goes from $O(n^2 2^n)$ to $O(n^2 2^{n-i})$.
Why is this a valid argument? Wouldn't a similar argument "we can cache all the results" turn every computable function into $O(1)$?
Reference: https://www.scottaaronson.com/democritus/lec5.html
Solution
You cannot "cache all the results" because your program (i.e., the description of your Turing machine) would need to be infinite. You can only cache a fixed, finite number of results. In your example, $i$ needs to be a fixed constant: you cannot pick, e.g., $i = \frac{n}{2}$ and lower the complexity from $O(n^2 2^n)$ to $O(n^2 2^{n/2})$.
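To illustrate the point, here is a minimal, hypothetical Python sketch (not from the answer; the function `slow_f` and the constant `k = 8` are invented for illustration). The lookup table of hardwired values is part of the program text itself, so its size is a fixed constant of the program: it speeds up finitely many inputs to constant time, but cannot change the asymptotic cost, because the table cannot grow with $n$.

```python
# Hypothetical illustration: hardwiring finitely many values of a function.
# The lookup table below is baked into the program text, so its size k is a
# fixed constant, independent of the input n.

def slow_f(n):
    # Stand-in for an expensive computable function taking ~2^n steps
    # (assumption: any deterministic function would do here).
    total = 0
    for _ in range(2 ** n):
        total += 1
    return total  # equals 2 ** n

# "Hardwire" the first k = 8 values: precomputed once, then fixed forever.
HARDWIRED = {n: slow_f(n) for n in range(8)}

def sped_up_f(n):
    # Constant time on the hardwired prefix; for all larger n the
    # asymptotic cost is exactly that of slow_f.
    if n in HARDWIRED:
        return HARDWIRED[n]
    return slow_f(n)
```

A program that hardwired *all* values would need an infinite description, which is why the "cache everything" argument fails.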
On an unrelated note $O(2^n)$ and $O(2^{n-1})$ are exactly the same set of functions so it is trivially true that a function that can be computed in time $O(2^n)$ can also be computed in time $O(2^{n-1})$ (or in time $O(2^{n-i})$ for any constant $i$).
The claim in the linked notes is stronger, as it works for any complexity bound $t$. You need to pick a $t$ that grows quickly enough to get something useful. In the notes, $t$ is defined recursively as $t(n) = 2^{t(n-1)}$, which (assuming $t(0) = 0$) is the power tower function growing as $1, 2, 2^2, 2^{2^2}, 2^{2^{2^2}}, \dots$
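The recursion $t(n) = 2^{t(n-1)}$ with $t(0) = 0$ can be computed directly; a short sketch (the function name `t` is just the symbol from the notes):

```python
def t(n):
    # Power tower: t(0) = 0, t(n) = 2 ** t(n - 1).
    if n == 0:
        return 0
    return 2 ** t(n - 1)

# First few values: 0, 1, 2, 4, 16, 65536, ...
```

Already $t(6) = 2^{65536}$, which is why a bound this fast-growing leaves room for an asymptotic (not merely constant-factor) speedup at every level.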
Context
StackExchange Computer Science Q#153860, answer score: 2