Composition of functions computable in logspace
Tags: functions, computable, logspace, composition
Problem
The bit-graph of $f\colon \{0,1\}^* \rightarrow \{0,1\}^*$ is the language:
$\text{BIT}_f := \{\langle x,i \rangle : 1\leq i \leq|f(x)| \text{ and the $i$-th bit of } f(x) \text{ is } 1\}$
It is said that $f$ is logspace computable if $\text{BIT}_f$ is decidable in space $O(\log n)$. Here decidable means that there exists a Turing machine $M$ such that:
- if $\langle x,i \rangle \in \text{BIT}_f$ then $M(\langle x,i \rangle) = 1$
- if $\langle x,i \rangle \notin \text{BIT}_f$ then $M(\langle x,i \rangle) = 0$
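To make the definition concrete, here is a toy illustration (my own example, not part of the problem): deciding $\text{BIT}_f$ for $f(x) = xx$ (the doubling function) directly from $x$ and $i$, without ever writing out $f(x)$.

```python
def in_bit_f(x, i):
    """Decide whether <x,i> is in BIT_f for f(x) = x concatenated with itself.

    Only the index arithmetic is stored, mirroring how a logspace machine
    keeps O(log n)-bit counters rather than the output string f(x).
    """
    n = len(x)
    if not (1 <= i <= 2 * n):        # i must index a position of f(x)
        return False
    return x[(i - 1) % n] == '1'     # the i-th bit of xx is the ((i-1) mod n)-th bit of x

# Example: f("10") = "1010", whose 3rd bit is '1'.
# in_bit_f("10", 3) -> True
```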
Prove that the composition $(f \circ g)(x)=f(g(x))$ of two logspace computable functions $f,g$ is also a logspace computable function.
Any hint on this exercise? What I have tried so far is playing with the composition of the two Turing machines associated with $f$ and $g$, but I did not succeed: it always ends up as a case-analysis exercise.
Solution
The idea is quite simple: we simulate running $M_f$ (the machine for $f$) on $g(x)$ without ever writing $g(x)$ down. Let $m = |g(x)|$, and note that $\log m = O(\log n)$, since a logspace machine halts within polynomially many steps and therefore $m$ is polynomial in $n$. We keep track of the location of the input-tape head of $M_f$ (this takes $O(\log m)$ space), as well as the contents of its work tapes (which also take $O(\log m)$ space). Whenever $M_f$ attempts to read from its input tape, we invoke the machine computing $\mathrm{BIT}_g$ on $\langle x, i \rangle$, which uses only $O(\log n)$ additional space; we know which index $i$ to query, since it is just the stored location of the head.
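The recompute-on-demand trick above can be sketched in code. This is an illustrative mock-up with hypothetical names (`bit_g`, `bit_f_via_oracle`, and the choice of $g$ as string reversal and $f$ as bit-flipping are all mine, not the answer's): $f$'s bit-machine never sees $g(x)$ as a string, and every input-tape read is answered by a fresh run of $g$'s bit-machine.

```python
def bit_g(x, i):
    """i-th bit (1-indexed) of g(x), where g reverses the string."""
    return x[len(x) - i]

def len_g(x):
    return len(x)

def bit_f_via_oracle(read, m, i):
    """i-th bit of f(y), where f flips every bit of its m-bit input y.

    y is accessed only through read(j) -> bit, mimicking how the simulated
    M_f's input-tape reads are intercepted instead of served from a tape.
    """
    return '1' if read(i) == '0' else '0'

def bit_composition(x, i):
    """i-th bit of (f o g)(x): each read by f triggers a fresh bit_g call,
    so g(x) is never stored -- only O(log)-size counters would be."""
    m = len_g(x)
    return bit_f_via_oracle(lambda j: bit_g(x, j), m, i)

x = "1101"
composed = ''.join(bit_composition(x, i) for i in range(1, len_g(x) + 1))
# g reverses "1101" to "1011"; f flips it to "0100".
```

The design point is that `bit_f_via_oracle` takes a bit-access callback rather than a string: this is exactly the reason recomputation keeps the space at $O(\log n)$ even though each input-tape read of $M_f$ costs a whole re-run of $M_g$'s bit-machine (time is traded for space).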
Context
StackExchange Computer Science Q#121094, answer score: 3