Why is log(n/p) asymptotically less than log(n)/log(p)
Problem
I'm trying to figure out which asymptotic complexity is better, $O(\log{\frac{n}{p}})$ or $O\left(\frac{\log{n}}{\log{p}}\right)$. Here $p$ is the degree of parallelism (i.e., the number of cores) and $n$ is the problem size. When I plot them in Grapher.app with any constant value for $p$, the first looks better:
But when I try to work out the math it doesn't seem right:
$$
O(\log{\frac{n}{p}}) = O(\log{n} - \log{p})
$$
$O\left(\frac{\log{n}}{\log{p}}\right)$ seems like a better bound than the above since it divides by $\log{p}$ instead of just subtracting it. What am I missing?
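Tabulating a few values alongside the plot is a quick sanity check (a sketch on my part, with $p = 4$ picked arbitrarily and base-2 logs assumed):

```python
import math

p = 4  # arbitrary fixed core count for illustration

for n in (8, 64, 1024, 2**20):
    first = math.log2(n / p)              # log(n/p) = log n - log p
    second = math.log2(n) / math.log2(p)  # (log n) / (log p)
    print(n, first, second)
```

Which curve sits lower depends on the range of $n$ shown, since one is shifted and the other is rescaled.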
Solution
I assume that $p > 1$ (which is equivalent to $\log p > 0$); otherwise the functions take negative values, which are meaningless as complexity measures.
For a given $p$, $O\left(\frac{\log n}{\log p}\right) = O(\log n)$ since big oh isn't affected by multiplication by a positive constant.
$\log \frac{n}{p} = \log n - \log p$. For any fixed $p$, $\log p$ is a constant, so $\log p = o(\log n)$ and therefore $O\left(\log \frac{n}{p}\right) = O(\log n)$.
So up to big-oh complexity (or big-theta, for the same reasons), these two classes are the same.
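A quick numerical check of this (a sketch, with $p$ fixed at 8 as an arbitrary choice): both expressions, divided by $\log n$, settle to positive constants as $n$ grows, which is exactly what membership in $\Theta(\log n)$ means.

```python
import math

p = 8  # fixed parallelism (arbitrary choice for illustration)

for n in (10**3, 10**6, 10**9, 10**12):
    f = math.log2(n / p)             # log(n/p) = log n - log p
    g = math.log2(n) / math.log2(p)  # (log n) / (log p)
    # f / log n tends to 1, and g / log n equals 1/log p exactly,
    # so for fixed p both are constant multiples of log n.
    print(f"{n:>14}  {f / math.log2(n):.4f}  {g / math.log2(n):.4f}")
```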
What your plot shows is just a difference of scale: to compare the two curves you need to bring them to the same scale. One is $a \log n$ for some constant $a > 0$; the other is $\log n + b$ for some constant $b$.
If you want to study the variation in $p$, then $O$ or $\Theta$ with respect to the variable $n$ is not a good way of modeling the complexity of your problem. You need more precise approximations that treat multiplicative constants as relevant. The complexity in $p$ is probably relevant to your problem, and it is very different. I'm not familiar enough with parallelism to know whether it's the right measure to use here; this could warrant a separate question (with more information about what you're modeling). You're going to need to express the complexity in terms of both $n$ and $p$, and look at how the variation in $p$ (not the complexity itself) behaves for large $n$.
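To illustrate what a cost expressed in both $n$ and $p$ looks like (a hypothetical example of mine, not from the question): the textbook model of a parallel reduction over $n$ elements on $p$ cores costs about $n/p + \log p$ steps.

```python
import math

def reduce_steps(n, p):
    # Textbook cost model for a parallel reduction (illustration only):
    # each of the p cores sums n/p elements sequentially, then the
    # partial sums are combined in a binary tree of depth log2(p).
    return n / p + math.log2(p)

# Doubling p halves the n/p term but adds one combining step,
# so the benefit of extra cores depends on both n and p together.
```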
Context
StackExchange Computer Science Q#21570, answer score: 10