In algorithm analysis, what does it mean for bounds to be "tight"?
Problem
For example, we could say that alg(x) runs in $\Omega(n)$, but that this bound is not "tight".
What is meant by "tight"? Is it that the bound isn't at its maximum?
So maybe a tighter bound could be $\Omega(1)$?
Solution
Generally speaking, a bound is tight if there is no "better" bound.
Take, for instance, $f(n) = \sin(n)$. Then, $-1.5 \leq f(n) \leq 2n$ are valid bounds on $f$ but they are not tight -- there are better bounds, e.g. $-1.2 \leq f(n) \leq 1.5n$.
Now, $-1 \leq f(n) \leq 1$ are tight bounds -- there are no tighter ones.
You will note that the range of $n$ is relevant here. I implicitly let $n \in \mathbb{N}$ (or $\mathbb{R}$ if you want) here; if we consider $n \in \{ \pi i \mid i \in \mathbb{N} \}$ there are better bounds.
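As a quick numeric illustration (a sanity check, not a proof) of how the bounds on $\sin$ and the choice of domain interact:

```python
import math

# Sanity check: sin stays within the tight bounds [-1, 1] on the naturals,
# while on the domain {pi * i} it is (up to floating-point error) always 0,
# so much tighter bounds exist there.
vals = [math.sin(n) for n in range(1, 1000)]
print(min(vals), max(vals))  # both inside [-1, 1]

# On multiples of pi, sin is essentially 0, so 0 <= f(n) <= 0 would be tight.
print(max(abs(math.sin(math.pi * i)) for i in range(1000)))
```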
With Landau symbols, the terminology is used similarly: $f \in \Omega(n)$ is a tight bound if there is no "better" $\Omega$-bound, i.e. one with a faster-growing parameter function. If we also know that $f \in O(n)$, then both bounds are tight and we have $f \in \Theta(n)$. It is, however, very possible that you have, say, $f \in \Omega(n)$ and $f \in O(n^2)$ and both bounds are tight.
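To see how both $\Omega(n)$ and $O(n^2)$ can be tight at once, consider a (made-up) function that alternates between linear and quadratic growth:

```python
# Hypothetical example: f oscillates between linear and quadratic growth,
# so f is in Omega(n) (hit infinitely often on even n) and in O(n^2)
# (hit infinitely often on odd n). Both bounds are tight, yet f is in
# Theta of neither n nor n^2.
def f(n):
    return n if n % 2 == 0 else n * n

for n in [10, 11, 1000, 1001]:
    print(n, f(n))
```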
Here, and in particular in the context of algorithm analysis, the range of (the implicit) parameters is relevant as well: are you talking about all inputs? Worst-case inputs? The average case? For instance, $\Omega(n)$ is a tight (lower) bound on the asymptotic running time of (improved) Bubblesort if you consider the best case, but not for average or worst case.
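One possible sketch of "improved" Bubblesort (the early-exit variant) with a comparison counter, showing why $\Omega(n)$ is tight for the best case but not the worst case:

```python
def bubblesort(a):
    """Early-exit Bubblesort; returns (sorted copy, number of comparisons)."""
    a = list(a)
    comparisons = 0
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:  # a pass with no swaps: the list is already sorted
            break
    return a, comparisons

# Best case (already sorted): one pass, n - 1 comparisons -> Theta(n).
print(bubblesort(list(range(100)))[1])        # 99
# Worst case (reverse sorted): n(n-1)/2 comparisons -> Theta(n^2).
print(bubblesort(list(range(100))[::-1])[1])  # 4950
```

So the best-case running time really is linear, while the worst case forces quadratically many comparisons; $\Omega(n)$ is a tight lower bound only for the former.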
Note that "tight" can be relative to your goals. If you want to prove asymptotics of the form $f \sim g$, you wouldn't call a $\Theta$-bound "tight".
Context
StackExchange Computer Science Q#74013, answer score: 6