HiveBrain v1.2.0

Time complexity of recursive power code

Submitted by: @import:stackexchange-cs
Tags: time, power, recursive, code, complexity

Problem

While I was learning about time complexity of recursive functions, I came across this code to calculate $x^n$:

power (x, n) {
    if n == 0
        return 1
    if n is even
        return power(x, n/2) * power(x, n/2)
    if n is odd
        return power(x, n/2) * power(x, n/2) * x
}


According to the book, its complexity is $\Theta(n)$, which seems rather strange to me. In my eyes, this code should have complexity $\Theta(n^2)$.

What am I getting wrong?

Solution

An Answer

I will take your code at face value under the Uniform Cost Model, in which every arithmetic operation costs a constant.

Let's now assume $n$ is of the form $n = 2^k$ where $k \in \mathbb{N}$ (i.e. $n$ is a power of $2$).

We then get a recurrence relation like so:

$$\begin{align}
T(x, n) & = 2 \cdot T\left(x, \frac{n}{2} \right) + c \\[0.5em]
& = 4 \cdot T\left(x, \frac{n}{4} \right) + 2c + c \\
& = \vdots \\[0.5em]
& = n \cdot T(x, 0) + \sum_{i = 1}^{\log_2 n} 2^{i - 1} \cdot c \\[0.5em]
& = n \cdot T(x, 0) + c(n - 1) \\
& = \Theta(n)
\end{align}$$

where the base case $T(x, 0)$ is a constant.
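To make the linear growth concrete, here is a small Python sketch (my own illustration, not part of the original question) that transcribes the pseudocode and tallies the multiplications it performs:

```python
def power(x, n, count):
    """Naive recursive power, mirroring the pseudocode in the question.

    `count` is a single-element list used to tally multiplications.
    """
    if n == 0:
        return 1
    if n % 2 == 0:
        # Both recursive calls really are executed, as in the question.
        a = power(x, n // 2, count)
        b = power(x, n // 2, count)
        count[0] += 1          # one multiplication: a * b
        return a * b
    a = power(x, n // 2, count)
    b = power(x, n // 2, count)
    count[0] += 2              # two multiplications: a * b, then (...) * x
    return a * b * x

count = [0]
result = power(3, 8, count)
print(result, count[0])  # prints: 6561 23
```

For $n = 2^k$ the multiplication count works out to $3n - 1$, i.e. $\Theta(n)$, matching the recurrence.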

Some Fun

Now let's take a step back and check out the Logarithmic Cost Model. Let's define the cost of one multiplication as $g(n)$, where $n = \log_2 x$ (the size of $x$ in binary) and $x$ is the larger of the two operands.

Along with our prior assumption about $n$, let's also assume $x$ is of the form $x = 2^k$ where $k \in \mathbb{N}$ (i.e. $x$ is a power of $2$).

We get a recurrence of the form:

$$\begin{align}
T(n) & = 2 \cdot T\left(\frac{n}{2} \right) + g\left(\frac{n}{2}\right) \\[0.5em]
& = 4 \cdot T\left(\frac{n}{4} \right) + 2 \cdot g\left(\frac{n}{4}\right) + g\left(\frac{n}{2}\right) \\
& = \vdots \\[0.5em]
& = \sum_{i = 1}^{\log_2 n} 2^{i - 1} \cdot g\left(\frac{n}{2^i}\right)\\[0.5em]
\end{align}$$

Now we can analyze some interesting multiplication methods:

  • Multiplication is linear in size, i.e. $g(n) = n$.



$$
\begin{align}
T(n) & = \Theta(n \log n)
\end{align}$$
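As a quick check of this bound, plugging $g(n) = n$ into the sum above gives

$$\sum_{i = 1}^{\log_2 n} 2^{i - 1} \cdot \frac{n}{2^i} = \sum_{i = 1}^{\log_2 n} \frac{n}{2} = \frac{n}{2} \log_2 n = \Theta(n \log n).$$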

  • Multiplication is quadratic in size, i.e. $g(n) = n^2$.



$$\begin{align}
T(n) & = \Theta(n^2)
\end{align}$$
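Again instantiating the sum, this time with $g(n) = n^2$:

$$\sum_{i = 1}^{\log_2 n} 2^{i - 1} \cdot \left(\frac{n}{2^i}\right)^2 = \frac{n^2}{2} \sum_{i = 1}^{\log_2 n} \frac{1}{2^i} = \frac{n^2}{2}\left(1 - \frac{1}{n}\right) = \Theta(n^2).$$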

  • Karatsuba multiplication, a divide-and-conquer method, with $g(n) = \Theta(n^{\log_2 3})$.



$$\begin{align}
T(n) & = \Theta(n^{\log_2 3})
\end{align}$$
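Here the sum becomes a convergent geometric series, so the top-level multiplication dominates:

$$\sum_{i = 1}^{\log_2 n} 2^{i - 1} \cdot \left(\frac{n}{2^i}\right)^{\log_2 3} = \frac{n^{\log_2 3}}{2} \sum_{i = 1}^{\log_2 n} \left(\frac{2}{3}\right)^{i} = \Theta(n^{\log_2 3}).$$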

  • Just for fun, and to plug Yuval's answer to my own question: multiplication whose cost is super-linear yet sub-polynomial, up to polylogarithmic factors, e.g. $g(n) = \Theta(n\log n \cdot 2^{\sqrt{2\log n}})$. We also assume $g$ is evaluated at size $n$ instead of $\frac{n}{2}$.

$$\begin{align}
T(n) & = 2 \cdot T\left(\frac{n}{2} \right) + \Theta\left(n\log n \cdot 2^{\sqrt{2\log n}}\right) \\[0.5em]
\end{align}$$

We use case 3 of the Master Theorem to get:

$$\begin{align}
T(n) & = \Theta\left(n\log n \cdot 2^{\sqrt{2\log n}}\right) \\[0.5em]
\end{align}$$

Essentially (though not exactly, since $g(n)$ is not polynomially larger than $n$, as case 3 formally requires): once the cost of multiplication becomes superlinear, it dominates the cost of the recursion itself. This is the intuition behind case 3 of the Master Theorem.

Even More Fun!

Now let's say this was implemented in a smarter manner: compute power(x, n/2) once and square the result, rather than repeating the recursive call (equivalently, memoize it).
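Such an implementation might look like the following Python sketch (again my own illustration), which makes only one recursive call per level and counts the multiplications performed:

```python
def power_smart(x, n, count):
    """Fast exponentiation: recurse once on n // 2, then square.

    `count` is a single-element list used to tally multiplications.
    """
    if n == 0:
        return 1
    half = power_smart(x, n // 2, count)  # single recursive call
    if n % 2 == 0:
        count[0] += 1      # one multiplication: half * half
        return half * half
    count[0] += 2          # two multiplications: half * half, then * x
    return half * half * x

count = [0]
result = power_smart(3, 8, count)
print(result, count[0])  # prints: 6561 5
```

Only $\Theta(\log n)$ multiplications are now performed, though under the logarithmic cost model the operands still double in size at each level, which is what the recurrence accounts for.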

We get a recurrence of the form:

$$\begin{align}
T(n) & = T\left(\frac{n}{2} \right) + g\left(\frac{n}{2}\right) \\[0.5em]
& = T\left(\frac{n}{4} \right) + g\left(\frac{n}{4}\right) + g\left(\frac{n}{2}\right) \\
& = \vdots \\[0.5em]
& = \sum_{i = 1}^{\log_2 n} g\left(\frac{n}{2^i}\right)\\[0.5em]
\end{align}$$

The major improvement: when multiplication is linear in size, we now get linear-time exponentiation!

$$
\begin{align}
T(n) & = \Theta(n)
\end{align}$$

Context

StackExchange Computer Science Q#69539, answer score: 2
