Can any constant-time operation be applied N times in O(log(N)) time?
Problem

Suppose f is a constant-time function. Is it always possible to apply it n times in O(log(n)) time?

Example
If f is not (boolean negation), then apply_not_n_times(n, bool) = if n % 2 == 0 then bool else !bool, which is O(log(n)), since it is linear in the number of bits of n. That means the answer is yes for that particular choice of f. I wonder if it can be proven that, for any choice of f, there is always an O(log(n)) implementation of apply_f_n_times(n, x).

Solution
Not necessarily. Let us use what is arguably the most common model of computation: the single-tape Turing machine, where the output of an algorithm is simply what is left on the tape. Here is a counterexample by Bergi. The operation $m$ that moves the head one step forward and writes a fixed symbol necessarily needs $n$ steps if we repeat $m$ $n$ times, since the output itself occupies $n$ tape cells.
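To make the counterexample concrete, here is a minimal simulation sketch (function names are mine, not from the answer) of repeating $m$ on a tape. The non-blank portion of the tape grows linearly with $n$, so any algorithm must spend at least $n$ head moves just writing the output:

```python
def apply_m(tape, head):
    """One application of m: move the head one cell right, write symbol 'x'."""
    head += 1
    while len(tape) <= head:      # extend the tape with blanks as needed
        tape.append("_")
    tape[head] = "x"
    return tape, head

def repeat_m(n):
    """Apply m to a blank tape n times; return the resulting tape contents."""
    tape, head = ["_"], 0
    for _ in range(n):
        tape, head = apply_m(tape, head)
    return "".join(tape)

print(repeat_m(5))   # '_xxxxx' -- output length grows linearly in n
```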
On the other hand, in some models of computation, any constant-time operation can be applied $N$ times in $\mathcal{O}(\log(N))$ time and in $\mathcal O(1)$ amortized time.
This section is devoted to proving the above partial positive answer.
For the sake of discussion/proof/disproof, let us make OP's question precise.
Let $f$ be a function that maps a set $S$ to $S$ itself. Define function $r(f,n)$ as applying $f$ repeatedly $n$ times. That is, for any $n\in\Bbb N$, $r(f,n): S\to S$ is defined by the following.
$$ r(f,n) := \begin{cases}
f & \text{ if }n = 1,\\
f\circ r(f,n-1) &\text{ otherwise.}\\
\end{cases}$$
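The definition above unrolls to plain repeated application. A direct (linear-time) reference implementation, in Python (the name `r` mirrors the notation; it is a sketch, not part of the original answer):

```python
def r(f, n, x):
    """Apply f to x repeatedly, n times (n >= 1): r(f, n) = f o r(f, n-1)."""
    for _ in range(n):
        x = f(x)
    return x

# Example: with f = boolean negation, r(f, n) is the parity-flip
# function from the question.
print(r(lambda b: not b, 3, True))   # False
```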
Claim: Suppose we can compute $f$ in constant time and $f(S)$ contains finitely many values. Then we can compute $r(f,n)$ in $\mathcal{O}(\log n)$ time and in $\mathcal O(1)$ amortized time.
Proof of the claim: (The key observation is that we are dealing with a set of states whose cardinality is bounded by a constant.)
Without affecting the correctness of this proof, we may assume that $|S|$ is bounded by a constant; otherwise, we can replace $S$ by $f(S)$ and $f$ by its restriction to $f(S)$.
Let $S^S$ be the set of functions from $S$ to $S$. Since $|S|$ is bounded by a constant, so is $|S^S|$. Hence the sequence of functions $r(f,1), r(f,2), \cdots$ is eventually periodic, since each term is determined by the previous one in the same way, namely by composition with $f$. So there are constants $c_1, c_2$ such that $r(f,n+c_1) = r(f,n)$ for all $n>c_2$. Now precompute and store the two-dimensional array $a$, where $a(i,s)=r(f,i)(s)$ for $1\leq i\le c_2+c_1$ and $s\in S$; since its size is bounded by a constant, this takes constant time. Given $N>c_2+c_1$, find the representative $n_0\in\{1,\cdots,c_1\}$ of $N-c_2$ modulo $c_1$ in $\mathcal{O}(\log(N))$ time. Now we have
$$r(f,N)(s) = a(n_0+c_2,s)$$
Note that the only (possibly) non-constant-time step for a given $N$ is the one-time computation of the remainder of $N-c_2$ modulo the constant $c_1$, which takes $\mathcal O(\log N)$ time. So once we have performed some $\Theta(\log N)$ operations anyway, the amortized time becomes $\mathcal O(1)$.
We have proved the claim.
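The proof is constructive; here is a sketch in Python (the names are mine). It finds the tail length $c_2$ and period $c_1$ by enumerating the finitely many iterates of $f$, then answers each query with a single modular reduction of $N$, which is the only $\Theta(\log N)$-bit work:

```python
def make_fast_repeat(f, S):
    """Precompute the eventually periodic sequence r(f,1), r(f,2), ... on a
    finite set S, so r(f, N)(s) can be answered with one modular reduction."""
    S = sorted(S)
    iterates = []                    # iterates[i] represents r(f, i+1)
    seen = {}                        # iterate -> first index, to detect the cycle
    g = {s: f(s) for s in S}         # r(f, 1)
    while True:
        key = tuple(g[s] for s in S)
        if key in seen:
            c2 = seen[key]           # tail length (0-based index of first repeat)
            c1 = len(iterates) - c2  # period
            break
        seen[key] = len(iterates)
        iterates.append(dict(g))
        g = {s: f(g[s]) for s in S}  # compose with f once more

    def fast_repeat(N, s):
        """Return r(f, N)(s) for N >= 1."""
        if N <= len(iterates):       # still inside the precomputed prefix
            return iterates[N - 1][s]
        n0 = (N - 1 - c2) % c1       # reduce N into the periodic part
        return iterates[c2 + n0][s]

    return fast_repeat

# Usage: f = successor mod 3, so r(f, N)(s) = (s + N) % 3.
fr = make_fast_repeat(lambda x: (x + 1) % 3, {0, 1, 2})
print(fr(10, 1))   # 2, i.e. (1 + 10) % 3
```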
In which models of computation?
Now we want to show that there exist models of computation with the following property: if we can compute a function $f$ in constant time, or more precisely in fewer than $c$ steps for some constant $c$ independent of the input, then $f(S)$ contains only finitely many values.
One such model of computation is, as mentioned by David Richerby, the multi-tape TM where the output must be written on a designated tape which must not contain any part of the input. (I would venture to say this might be the model that best fits most people's mental image of computation, where we have input at one side of a box and output at the other side. However, whether this model is common or not is not the focus of this post.) In this model of computation, since any constant-time operation $f$ can only move the head on that designated tape at most $c$ steps forward and at most $c$ steps backward for some constant $c$, it can produce at most $|\Sigma|^{2c+1}$ distinct outputs, where $\Sigma$ is the set of symbols used on that designated tape. That is, $f$ can only have finitely many values.
Because of the multitude of models of computation, we will not and cannot list all the models that comply. In any case, thanks to the claim just proved, in those models we have the partial positive answer stated in the second paragraph of this post.
Although we have shown that in some models of computation we can apply a given constant-time operation $N$ times in $\mathcal O(\log(N))$ time, the constant term hidden in the $\mathcal O$ notation might be impractically large. For example, for some endomorphism $f$ of a set of 1000 elements, the smallest $c_1$ is $1516385558956728808659224171023365600$, which is $g(1000)$ for Landau's function $g$. For a cryptographic hash operation, which can have about $2^{256}$ states, the smallest $c_1$ might have more than trillions of digits. Although that constant term can be reduced significantly, it remains impractically large. In that sense, this partial positive answer may not be what OP was hoping for at all, and it does not lessen our confidence in those cryptographic hash operations in any way.
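To get a feel for how fast the worst-case period grows, Landau's function $g(n)$ (the maximum lcm of any partition of $n$, equivalently the maximum order of an element of the symmetric group $S_n$) can be computed for small $n$ with a knapsack-style DP over prime powers. This is a sketch of my own, not part of the original answer:

```python
def landau(n):
    """Landau's function g(n): the largest lcm of any partition of n.
    The optimum uses distinct prime powers, so a knapsack DP suffices."""
    # primes up to n via a simple sieve
    sieve = [True] * (n + 1)
    primes = []
    for p in range(2, n + 1):
        if sieve[p]:
            primes.append(p)
            for q in range(p * p, n + 1, p):
                sieve[q] = False
    # dp[j] = max product of distinct prime powers (one power per prime)
    # whose parts sum to at most j
    dp = [1] * (n + 1)
    for p in primes:
        for j in range(n, 0, -1):    # descending j: each prime used at most once
            pk = p
            while pk <= j:
                dp[j] = max(dp[j], dp[j - pk] * pk)
                pk *= p
    return dp[n]

print([landau(n) for n in range(1, 11)])   # [1, 2, 3, 4, 6, 6, 12, 15, 20, 30]
```

Even at these tiny sizes the growth is superpolynomial, which is why $c_1$ for a set of 1000 elements already has 37 digits.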
Context
StackExchange Computer Science Q#96289, answer score: 5