HiveBrain v1.2.0

Intuition behind the Master Theorem

Submitted by: @import:stackexchange-cs

Problem

The Master Theorem provides a method of solving recurrence relations for divide-and-conquer algorithms. It was first presented to me in my intro algorithms class as the following:


For a recurrence of the form $T(n) = aT(\frac{n}{b})+f(n)$
with constants $a \ge 1$ and $b>1$,


Case 1: if $f(n) = O(n^{\log_b(a)-\varepsilon})$ for some $\varepsilon > 0$, then $T(n) = \Theta(n^{\log_b(a)})$


Case 2: if $f(n) = \Theta(n^{\log_b(a)})$, then $T(n) = \Theta(\log n \cdot n^{\log _b(a)})$


Case 3: if $f(n) = \Omega(n^{\log_b(a)+\varepsilon})$ for some $\varepsilon > 0$, and if $a f(\frac{n}{b}) \le c f(n)$ for some
constant $c < 1$ and all sufficiently large $n$, then $T(n) = \Theta(f(n))$
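
As a concrete illustration of applying the cases (merge sort is my example here, not part of the original post): for $T(n) = 2T(\frac{n}{2}) + cn$ we have $a = 2$ and $b = 2$, so $n^{\log_b(a)} = n^{\log_2 2} = n$. Since $f(n) = cn = \Theta(n^{\log_b(a)})$, Case 2 gives
$$
T(n) = \Theta(n \log n).
$$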

I think the second answer is closer to the kind of intuition I'm looking for. It explains the $aT(\frac{n}{b})$ term and the $f(n)$ term as opposing forces of replication and merging, respectively. However, I'm having trouble understanding how the intuitive cases (the replication step winning, the merging step winning, and the forces being equal) actually translate into the three different cases of the Master Theorem.

This explanation also follows a similar argument and simplifies the Master Theorem in the following way:


If $T(n) = aT(\frac{n}{b})+f(n)$ then compare $n^{\log_b a}$ with $f(n)$



  • If $f(n) < n^{\log_b a}$, then $T(n)=n^{\log_b a}$



  • If they are equal, then $T(n)=f(n)\log n$



  • If $f(n) > n^{\log_b a}$, then $T(n)=f(n)$




While I see the connection with the Master Theorem cases here, there isn't an intuitive explanation for where the $n^{\log_b(a)}$ term comes from. That is, if I were just looking at the recurrence relation $T(n) = aT(\frac{n}{b})+f(n)$, how would I know to compare $f(n)$ with $n^{\log_b(a)}$? Perhaps there is some connection between $n^{\log_b(a)}$ and $aT(\frac{n}{b})$ that I'm missing?
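
One connection worth spelling out (this is my own sketch, not part of the original question): if you unroll the $aT(\frac{n}{b})$ term alone, ignoring the merge cost, the number of subproblems is multiplied by $a$ at each of the $\log_b n$ levels of recursion, so the recursion tree ends with $a^{\log_b n} = n^{\log_b a}$ leaves. That leaf count is exactly the quantity $f(n)$ gets compared against. A quick numeric check of the identity, with made-up parameters:

```python
import math

# Hypothetical recurrence parameters: T(n) = a*T(n/b) + f(n).
a, b = 7, 2
n = 2 ** 10                        # n a power of b, so the depth is log_b(n) = 10

depth = round(math.log(n, b))      # number of levels of recursion
leaves = a ** depth                # each level multiplies the subproblem count by a

# a^(log_b n) == n^(log_b a): take log_b of both sides to see they agree.
print(leaves)                      # 282475249 = 7**10
print(round(n ** math.log(a, b)))  # same value
```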

In summary, I'm looking for something of the form "here is a high level concept/understanding (eg. geometric growth/decay, competing forces), and now here is how you can derive the Master Theorem from that idea." I would be satisfied with an answer along those lines.

Solution

You can use repeated substitution to obtain
$$
T(n) = f(n) + af(n/b) + a^2f(n/b^2) + \cdots
$$
Now suppose that $f(n) = n^\gamma$. Then
$$
\begin{align*}
T(n) &= n^\gamma + a (n/b)^\gamma + a^2 (n/b^2)^\gamma + \cdots \\ &=
n^\gamma \left[ 1 + \frac{a}{b^\gamma} + \left(\frac{a}{b^\gamma}\right)^2 + \cdots \right].
\end{align*}
$$
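
The bracketed factor is a geometric series with ratio $r = \frac{a}{b^\gamma}$, and the three cases below correspond to the three regimes of $r$. A small sketch (my own illustration, with made-up parameters) that evaluates the partial sums:

```python
import math

def bracket_sum(a, b, gamma, n):
    """Partial geometric sum 1 + r + r^2 + ... + r^k with r = a / b**gamma,
    where k = log_b(n) is the recursion depth (n a power of b)."""
    r = a / b ** gamma
    k = round(math.log(n, b))
    return sum(r ** i for i in range(k + 1))

n = 2 ** 16
print(bracket_sum(1, 2, 1, n))   # r < 1: bounded by 1/(1-r) = 2, a constant
print(bracket_sum(2, 2, 1, n))   # r = 1: exactly log_b(n) + 1 = 17 equal terms
print(bracket_sum(4, 2, 1, n))   # r > 1: dominated by the last term, 2^17 - 1
```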

Let us assume that $n$ is a power of $b$, and that there is a base case $T(n) = C$ for some $C>0$.

There are three different cases to consider:

-
If $a < b^\gamma$ then the terms keep decreasing geometrically, so the bracketed sum is bounded by the constant $\frac{1}{1-a/b^\gamma}$, and $T(n) = \Theta(n^\gamma)$.

-
If $a = b^\gamma$ then all terms are the same. Since there are $\log_b n+1$ of them, $T(n) = \Theta(n^\gamma \log n)$.

-
If $a > b^\gamma$ then the terms keep increasing, and so we can approximate $T(n)$ by the last term, which is $a^{\log_b n} C = \Theta(n^{\log_b a})$.

The condition $a \lesseqqgtr b^\gamma$ is the same as $\log_b a \lesseqqgtr \gamma$, which is how the theorem is usually presented.

The actual theorem has some extra conditions that are needed in some of the cases when $f(n)$ is not a function of the exact form $n^\gamma$, but these are just technicalities.
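
Those technicalities aside, the three cases are easy to check numerically when $f(n) = n^\gamma$. The sketch below (my own, with illustrative parameters) evaluates the recurrence directly and divides by each case's predicted growth rate; the ratios settle near constants:

```python
import math

def T(n, a, b, gamma, C=1.0):
    """Directly evaluate T(n) = a*T(n/b) + n**gamma with base case T(1) = C."""
    if n <= 1:
        return C
    return a * T(n // b, a, b, gamma, C) + n ** gamma

n = 2 ** 12  # a power of b = 2

# a < b^gamma: the merge term wins, T(n) = Theta(n^gamma).
print(T(n, 1, 2, 1) / n)                   # approaches 2

# a = b^gamma: a tie, T(n) = Theta(n^gamma * log n).
print(T(n, 2, 2, 1) / (n * math.log2(n)))  # approaches 1

# a > b^gamma: replication wins, T(n) = Theta(n^(log_b a)) = Theta(n^2).
print(T(n, 4, 2, 1) / n ** 2)              # approaches 2
```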

Context

StackExchange Computer Science Q#85944, answer score: 8
