HiveBrain v1.2.0

What does a faster algorithm mean in theoretical computer science?

Submitted by: @import:stackexchange-cs

Problem

If there is an algorithm running in time $O(f(n))$ for some problem A, and somebody comes up with an algorithm running in time $O(f(n)/g(n))$, where $g(n) = o(f(n))$, is it considered an improvement over the previous algorithm?


Does it make sense, in the context of theoretical computer science, to come up with such an algorithm?

Solution

No, an algorithm running in time $O(f(n)/g(n))$, where $g(n) = o(f(n))$, is not necessarily considered an improvement. For example, suppose that $f(n) = n$ and $g(n) = 1/n$. Then $O(f(n)/g(n)) = O(n^2)$ is a worse time bound than $O(f(n)) = O(n)$.
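To make the counterexample concrete, here is a quick numeric sanity check (a sketch; the function names are just illustrative): with $f(n) = n$ and $g(n) = 1/n$, the "new" bound $f(n)/g(n)$ equals $n^2$, which grows strictly faster than the original bound $f(n) = n$, even though $g(n) = o(f(n))$ holds.

```python
def f(n):
    return n        # original time bound: O(n)

def g(n):
    return 1 / n    # g(n) = 1/n, which is o(f(n)) since (1/n)/n -> 0

for n in [10, 100, 1000]:
    new_bound = f(n) / g(n)   # equals n^2: a *worse* bound than f(n) = n
    print(n, f(n), new_bound)
```

The point is that dividing by a function that is $o(f(n))$ does not force the quotient to grow more slowly than $f(n)$; when $g(n) \to 0$, it can grow faster.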

In order to improve upon an algorithm running in time $f(n)$, you need to come up with an algorithm running in time $o(f(n))$, that is, in time $g(n)$ for some function $g(n) = o(f(n))$.

If all you know is that an algorithm runs in time $O(f(n))$, then it is not clear whether an algorithm running in time $O(g(n))$ is an improvement, regardless of what $f(n)$ and $g(n)$ are. This is because big $O$ is only an upper bound on the running time. Instead, it is common to consider the worst-case time complexity, and to estimate it as a big $\Theta$ rather than just as a big $O$.
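To illustrate why an upper bound alone is not informative, here is a small sketch (the operation-counting function is just illustrative): a plain linear scan can truthfully be described as running in $O(n^2)$, since big $O$ only promises an upper bound, yet its tight worst-case complexity is $\Theta(n)$. So "an $O(n)$ algorithm" would not beat this "$O(n^2)$ algorithm".

```python
def linear_scan_ops(xs):
    """Count the basic operations of a linear scan over xs."""
    ops = 0
    for _ in xs:
        ops += 1    # one basic operation per element
    return ops

n = 1000
ops = linear_scan_ops(range(n))
# Exactly n operations: Theta(n), even though the statement
# "this algorithm runs in O(n^2)" is also technically true.
print(ops)
```

This is why comparisons between algorithms are meaningful when the stated bounds are tight ($\Theta$), not merely valid upper bounds ($O$).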

Context

StackExchange Computer Science Q#96108, answer score: 26
