HiveBrain v1.2.0

Can someone clarify landau symbols definition please?

Submitted by: @import:stackexchange-cs

Problem

I'm more or less familiar with the Landau symbols, specifically as they are used in computer science for complexity analysis, but I was hoping someone could clarify a few points for me. I'll mention up front that I know the technically correct notation is to say $f(n)$ is an element of $O(f(n))$, not $f(n) = O(f(n))$.

Loosely, my understanding is this: given $f(n) = n^2 + 2n + \log(n)$, Big O represents an upper bound, so it would be correct (according to a past CS tutor), but not very accurate, to say $O(f(n)) = n^3$, but incorrect to say $O(f(n)) = \log(n)$. Conversely, $\Omega(f(n))$ represents a lower bound, so it would be correct, but inaccurate, to say $\Omega(f(n)) = \log(n)$ or (not sure about this one) $\Omega(f(n)) = 1$.

I know that $\Theta(f(n))$ is defined as the intersection of $O(f(n))$ and $\Omega(f(n))$, but is it valid to drop lower orders of magnitude in the case of $\Theta(f(n))$? I.e., is $\Theta(f(n)) = n^2$ correct? If not, what is the correct equality?

What I am entirely unsure about is $o(f(n))$ and $\omega(f(n))$. I've seen the mathematical definitions, and I know that the only difference between them and $O(f(n))$ and $\Omega(f(n))$ respectively is that rather than "there exists a $C$", it is "for all $C$", but my grasp of mathematical notation is (at this stage) embarrassing. If anyone could shed some light on all of the above, it would be appreciated. Thanks.

Solution

In the case of $O(f(n))$ and $\Omega(f(n))$, you are correct that a bound can be wildly inaccurate and still be correct: for instance, $n$ is $O(2^n)$, which is true but provides very little information.
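A quick numerical sanity check of that example (my own illustration, not part of the original answer; the choice of witness constant $C = 1$ and threshold $n_0 = 1$ is mine):

```python
# n is O(2^n): there exist C and n0 with n <= C * 2**n for all n >= n0.
# Here C = 1 and n0 = 1 already work, even though the bound is very loose.
C, n0 = 1, 1
assert all(n <= C * 2**n for n in range(n0, 100))

# The bound holds, but the gap is enormous by n = 50:
print(2**50 / 50)  # ratio g(n)/f(n) at n = 50
```

The assertion passes, which is exactly the point: a true upper bound need not be informative.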

For $\Theta$, yes, you can drop the lower orders of magnitude. "$f(n)$ is $\Theta(g(n))$" means that there are two constants $C_1$ and $C_2$ and two thresholds $n_1$ and $n_2$ such that $C_1 g(n) > f(n)$ for all $n$ greater than $n_1$, and likewise $C_2 g(n) < f(n)$ for all $n$ greater than $n_2$. In your case, multiplying $n^2$ by even $2$ will dwarf the remaining terms of $f(n)$ for sufficiently large $n$, and likewise multiplying it by $1$ keeps it always smaller.
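The two-constant definition can be checked numerically for the $f$ from the question (a sketch of mine; the particular witnesses $C_1 = 2$, $C_2 = 1$, $n_0 = 3$ are my choices, not from the original answer):

```python
import math

def f(n):
    """The function from the question: f(n) = n^2 + 2n + log(n)."""
    return n**2 + 2*n + math.log(n)

# Theta witnesses: C2 * n^2 <= f(n) <= C1 * n^2 for all n >= n0.
C1, C2, n0 = 2, 1, 3
assert all(C2 * n**2 <= f(n) <= C1 * n**2 for n in range(n0, 10_000))
```

Both inequalities hold over the tested range, so $n^2$ sandwiches $f(n)$ within constant factors, which is what $f(n) = \Theta(n^2)$ asserts.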

Little $o$ and $\omega$ notation is used to describe a function as being orders of magnitude less than another. In big $O$ notation, as long as one constant $C$ exists such that $Cg(n)>f(n)$ for sufficiently large $n$, then $f(n)$ is $O(g(n))$; with little $o$, every possible $C > 0$ must make this happen. The easiest way to check this is to take the limit $\lim_{n \to \infty} \frac{f(n)}{g(n)}$: if it is zero, then $f(n)$ is $o(g(n))$; if it is $\infty$, then $f(n)$ is $\omega(g(n))$.
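The limit test can be approximated numerically (my sketch, using the earlier $n$ vs. $2^n$ pair as the example functions):

```python
# Approximate lim f(n)/g(n) for f(n) = n, g(n) = 2^n by sampling
# the ratio at growing n. It shrinks toward 0, so n is o(2^n);
# symmetrically, 2^n / n blows up, so 2^n is omega(n).
ratios = [n / 2**n for n in (10, 20, 40, 80)]
assert all(a > b for a, b in zip(ratios, ratios[1:]))  # strictly shrinking
assert ratios[-1] < 1e-20                              # effectively zero
```

Of course a few samples are only evidence, not a proof; the actual claim is about the limit, which here is $0$ since $2^n$ outgrows every polynomial.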

Context

StackExchange Computer Science Q#28811, answer score: 6
