HiveBrain v1.2.0

Big-O / $\tilde{O}$ -notation with multiple variables when function is decreasing in one of its arguments

Submitted by: @import:stackexchange-cs

Problem

Say we have an algorithm that takes as input a triple $(X, A, \epsilon)$, where $X$ is a sequence of $n$ bytes, of which the algorithm might query only a subset, and $A$ and $\epsilon$ are numbers with $A \geq 1$ and $\epsilon > 0$.
It is claimed that this algorithm runs in time

$$
\tilde{O}(\frac{n}{A^3 \epsilon})
$$

as $n \to \infty$, $A \to \infty$ and $\epsilon \to 0$.

(Note that $\tilde{O}$ means big-O except ignoring logarithmic factors. For functions of one variable, $f(n) \in \tilde{O}(g(n))$ iff there exists a $k \geq 1$ such that $f(n) \in O(g(n) \cdot \log^k g(n))$.)
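For instance (a concrete one-variable example of my own, not from the original question): taking $g(n) = n$ and $k = 3$,

$$
f(n) = 5n \log^3 n \in O\bigl(n \cdot \log^3 n\bigr) = O\bigl(g(n) \cdot \log^k g(n)\bigr),
$$

so $f(n) \in \tilde{O}(n)$ even though $f(n) \notin O(n)$.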

My question is, what exactly does this mean formally?

So this is another question about "asymptotic notation with multiple variables". There are existing questions about this, but they don't apply to this case.

This case is different because the expression $\frac{n}{A^3 \epsilon}$ is decreasing in one of its arguments, namely $A$.
The standard definition for asymptotic notation with multiple variables
given in Wikipedia
breaks down in this case, as described below.
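To make the breakdown concrete, here is a sketch of one standard failure mode, under my reading of the Wikipedia-style definition (all variables required to be simultaneously large, with $\epsilon$ treated via $1/\epsilon$ so that every quantity tends to infinity):

$$
F(n,A,\epsilon) \in O\!\left(\frac{n}{A^3 \epsilon}\right)
\iff
\exists C, M > 0 \text{ s.t. } F(n,A,\epsilon) \le C \cdot \frac{n}{A^3 \epsilon}
\text{ whenever } n \ge M,\ A \ge M,\ \tfrac{1}{\epsilon} \ge M.
$$

The trouble is that the region $n \ge M$, $A \ge M$, $1/\epsilon \ge M$ contains points where $A$ is enormous relative to $n$ (say $A = 2^n$ with $\epsilon$ fixed), so $\frac{n}{A^3 \epsilon} \to 0$ there, and the bound would force $F \to 0$; no algorithm that performs even one operation can satisfy that.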

I also tried some "obvious" alternative definitions, which also seem to fail, as described below.

(The actual algorithm that I'm wondering about is Algorithm IV in this paper. It's an approximation algorithm that outputs an estimate of $f(X)$ for a certain function $f$, and is claimed to "run in time $\tilde{O}(\frac{n}{A^3 \epsilon})$". From context one can infer that this asymptotic claim is meant to hold as $n \to \infty$, $A \to \infty$, and $\epsilon \to 0$ (rather than, say, as $n \to \infty$, $A \to \infty$, and $\epsilon \to \infty$). $A$ and $\epsilon$ determine the error bounds: the larger $A$ and $\epsilon$ are, the looser the bounds, and the farther the output is allowed to diverge from $f(X)$.)

So, let $F(n,A,\epsilon)$ be the number of operations performed by the algorithm when run with an input of size $n$ and parameters $A$ and $\epsilon$. (If different inputs of size $n$ lead to different numbers of operations, take the maximum over all such inputs.)

Solution

The tilde usually suppresses logarithmic factors. So in your case, the true upper bound should be something like
$$
O\left(\frac{n}{A^3 \epsilon} \bigl(\log n \log A \log \frac{1}{\epsilon}\bigr)^{O(1)} \right).
$$
(Equivalently, you can replace $\log n \log A \log \frac{1}{\epsilon}$ with $\log n + \log A + \log \frac{1}{\epsilon}$, or equivalently, with $\log \frac{nA}{\epsilon}$.)
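The claimed equivalences can be checked numerically. The sketch below (my own illustration, not part of the answer) verifies at a few sample points that the sum form coincides exactly with the single-log form, and that the product and sum forms agree up to a fixed power once each individual log is at least $1$ (i.e. $n \ge e$, $A \ge e$, $\epsilon \le 1/e$), which is all that matters inside a $(\cdot)^{O(1)}$:

```python
import math

def prod_form(n, A, eps):
    # log n * log A * log(1/eps)
    return math.log(n) * math.log(A) * math.log(1 / eps)

def sum_form(n, A, eps):
    # log n + log A + log(1/eps)
    return math.log(n) + math.log(A) + math.log(1 / eps)

def single_log_form(n, A, eps):
    # log(nA/eps); equals sum_form exactly, by log rules
    return math.log(n * A / eps)

for n, A, eps in [(10, 10, 0.1), (10**6, 10**3, 1e-9), (100, 10**9, 0.01)]:
    p = prod_form(n, A, eps)
    s = sum_form(n, A, eps)
    l = single_log_form(n, A, eps)
    assert abs(s - l) < 1e-9      # sum and single-log forms coincide
    assert p <= s ** 3 + 1e-9     # product <= sum^3 (each factor <= sum)
    assert s <= 3 * p + 1e-9      # sum <= 3 * product, since each log >= 1

print("all three polylog forms agree up to a fixed power")
```

Since each form is bounded above and below by a fixed power of the others in this regime, raising any of them to an $O(1)$ power yields the same $\tilde{O}$ class.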

You're right that $\tilde{O}$ notation is ambiguous, but in most cases the intended meaning is apparent.

Context

StackExchange Computer Science Q#94043, answer score: 2
