Improving time complexity from O(log n / log log n) to O(log((n log log n)/log n) / log log((n log log n)/log n))
Problem
Suppose I have an algorithm whose running time is
$O(f(n))$ where $f(n) = O\left(\frac{\log n}{\log\log n}\right)$
And suppose I can change this running time in $O(1)$ steps into $O\left(f\left(\frac{n}{f(n)}\right)\right)$, i.e. I can get an algorithm whose running time is $O(g(n)) = O\left(\frac{\log\frac{n}{\frac{\log n}{\log\log n}}} {\log\log\frac{n}{\frac{\log n}{\log\log n}}}\right) = O\left(\frac{\log\frac{n\log\log n}{\log n}} {\log\log\frac{n\log\log n}{\log n}}\right)$.
I'm pretty sure that $g(n)\lt f(n)$ for large enough $n$, and I have two questions:

- Is $g(n)$ asymptotically better than $f(n)$, i.e. is $g(n)=o(f(n))$?

- Assuming it is asymptotically better, I can apply this step again and further improve the running time of the algorithm. That is, in one more step I can make my algorithm run in time $O\left(f\left(\frac{n}{f\left(\frac{n}{f(n)}\right)}\right)\right)$, and I can repeat this process as many times as I want. How many times should the process be repeated to get the best asymptotic running time, and what will it be? Obviously, repeating it $O(f(n))$ times will already take $O(f(n))$ time for the repetition alone and will not improve the overall complexity of the algorithm.
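As a numeric illustration of the transformation (my own sketch, not part of the original question), $f$ and $g$ can be evaluated at astronomically large $n$ by working with $L = \log n$ directly; the helper names `f_from_logn` and `g_from_logn` are mine:

```python
import math

def f_from_logn(L):
    # f(n) = log n / log log n, parametrized by L = log n so that
    # astronomically large n (e.g. n = e^10000) never overflows a float.
    return L / math.log(L)

def g_from_logn(L):
    # g(n) = f(n / f(n)); note log(n / f(n)) = L - log L + log log L.
    Lh = L - math.log(L) + math.log(math.log(L))
    return Lh / math.log(Lh)

for L in (1e2, 1e4, 1e8):  # i.e. n = e^100, e^10000, e^100000000
    print(f"log n = {L:g}:  f = {f_from_logn(L):.2f},  g = {g_from_logn(L):.2f}")
```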
Solution
Here are the functions.
$$f(n) = \frac{\log(n)}{\log\log(n)}$$
$$h(n)=\frac n{f(n)}=\frac{n\log\log(n)}{\log(n)}$$
$$g(n)=f(h(n))=\frac{\log(\frac{n\log\log(n)}{\log(n)})} {\log\log(\frac{n\log\log(n)}{\log(n)})}$$
- Is $g(n)\lt f(n)$ for large enough $n$?

Yes, here is a proof.

Let us compute the derivative of $f(x)$ with respect to $x$ as a real variable:
$$\frac{df}{dx}
=\frac{\frac{\log\log x}{x}-\frac{\log x}{x\log x}}{(\log\log x)^2}
=\frac{\log\log x-1}{x\,(\log\log x)^2}$$
So $\frac{df}{dx}\gt0$ when $\log\log x\gt1$, i.e. when $x\gt e^e\approx15.15$. Since $f(n)\gt1$ for large $n$, we have $h(n)=\frac n{f(n)}\lt n$ with $h(n)\to\infty$, so monotonicity gives $g(n)=f(h(n))\lt f(n)$ for large enough $n$.
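A quick numeric check of this monotonicity claim (my own illustration, not part of the answer): $f$ decreases on $(e, e^e)$ and increases beyond $e^e\approx15.15$:

```python
import math

def f(x):
    # f(x) = log x / log log x (natural logs), defined for x > e
    return math.log(x) / math.log(math.log(x))

# f decreases before x = e^e ~ 15.15 ...
print(f(5), f(10))
# ... and increases after it
print(f(20), f(100))
```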
- Is $g(n)$ asymptotically better than $f(n)$, i.e. is $g(n)=o(f(n))$?

No, $g(n)=\Theta(f(n))$. In fact, $\displaystyle\lim_{n\to\infty}\frac{g(n)}{f(n)}=1$.
- Assuming this is asymptotically better, I can do this step again ...

Well, that assumption is invalid.
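To see concretely that iterating does not help, here is a sketch (my own, not from the answer) that repeatedly applies $T \mapsto f(n/T)$, again parametrized by $L=\log n$ so huge $n$ stays representable:

```python
import math

def iterate_runtimes(L, steps):
    # Repeatedly apply T -> f(n / T), working with L = log n.
    # log(n / T) = L - log T, so the next runtime is (L - log T) / log(L - log T).
    T = L / math.log(L)          # T_0 = f(n)
    history = [T]
    for _ in range(steps):
        Lk = L - math.log(T)     # log of the shrunken input n / T_k
        T = Lk / math.log(Lk)    # T_{k+1} = f(n / T_k)
        history.append(T)
    return history

# For n = e^10000: the first step improves the runtime by less than 1,
# and the iterates settle almost immediately at a fixed point T* = f(n/T*).
print(iterate_runtimes(1e4, 5))
```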
Here are two exercises.
Exercise 1. Show that $\displaystyle\lim_{n\to\infty}\frac{g(n)}{f(n)}=1$ using the following hints or steps.

- Recall that $\log(n)\gt\log\log(n)\gt0$ when $n$ is large enough.

- Show that $\displaystyle\lim_{n\to\infty}\frac{\log(\frac{n\log\log(n)}{\log(n)})}{\log(n)}=1$.

- Show that $\displaystyle\lim_{n\to\infty}\frac{\log\log(\frac{n\log\log(n)}{\log(n)})}{\log\log(n)}=1$.

Exercise 2. Show that $g(n)-f(n)=O(1)$. (This exercise might be hard.)
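Both exercises can be sanity-checked numerically (my own sketch; `f_from_logn` and `g_from_logn` are my names, with everything parametrized by $L=\log n$ to avoid overflow):

```python
import math

def f_from_logn(L):
    # f(n) = log n / log log n with L = log n
    return L / math.log(L)

def g_from_logn(L):
    # g(n) = f(n / f(n)); log(n / f(n)) = L - log L + log log L
    Lh = L - math.log(L) + math.log(math.log(L))
    return Lh / math.log(Lh)

for L in (1e2, 1e4, 1e8, 1e12):
    ratio = g_from_logn(L) / f_from_logn(L)   # Exercise 1: tends to 1
    diff = g_from_logn(L) - f_from_logn(L)    # Exercise 2: stays bounded
    print(f"log n = {L:g}:  g/f = {ratio:.8f},  g - f = {diff:.3f}")
```

The ratio creeps toward 1 extremely slowly while the difference stays below 1 in absolute value, which is consistent with $g(n)=\Theta(f(n))$ and $g(n)-f(n)=O(1)$.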
Context
StackExchange Computer Science Q#119338, answer score: 4