What does $O$ mean in this context?
Problem
I understand big O notation in computational complexity theory, but I don't see how it applies in the equation below.
From Pattern Recognition and Machine Learning:
If we weren't familiar with the rules of ordinary calculus, we could
evaluate a conventional derivative $dy/dx$ by making a small change
to the variable $x$ and then expanding in powers of $\epsilon$, so that
$y(x + \epsilon) = y(x) + \frac{dy}{dx}\epsilon + O(\epsilon^2)$
and finally taking the limit $\epsilon \to 0$.
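As a concrete illustration (not part of the quoted passage), take $y = x^2$, where the $O(\epsilon^2)$ term can be written out explicitly:

$$y(x + \epsilon) = (x + \epsilon)^2 = x^2 + 2x\,\epsilon + \epsilon^2$$

Here $\frac{dy}{dx} = 2x$ is the coefficient of $\epsilon$, and the remainder is exactly $\epsilon^2$, which is $O(\epsilon^2)$ with constant $c = 1$. Subtracting $y(x)$, dividing by $\epsilon$, and letting $\epsilon \to 0$ makes the $O(\epsilon^2)$ term vanish and recovers $\frac{dy}{dx} = 2x$.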
Solution
It means exactly the same thing as in computer science. It means there is a constant $c$ (not depending on $\epsilon$) such that the absolute value of the term is at most $c \cdot \epsilon^2$ for all sufficiently small $\epsilon$. Big-O notation is not unique to computer science; it originated in mathematics and is widely used there. See https://en.wikipedia.org/wiki/Big_O_notation, which explains some ways it is used in mathematics.
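A quick numerical check of this reading of $O(\epsilon^2)$: define the remainder $R(\epsilon) = y(x+\epsilon) - y(x) - y'(x)\epsilon$ and verify that $|R(\epsilon)| / \epsilon^2$ stays bounded as $\epsilon$ shrinks. The sketch below (my own illustration, using $y = \sin$ at $x = 1$, where Taylor's theorem gives $|R| \le \epsilon^2/2$) is a minimal demonstration, not anything from the answer itself:

```python
import math

def remainder(y, dy, x, eps):
    # R(eps) = y(x + eps) - y(x) - y'(x) * eps
    # The claim "R is O(eps^2)" means |R(eps)| <= c * eps^2
    # for some constant c independent of eps.
    return y(x + eps) - y(x) - dy(x) * eps

x = 1.0
for eps in (1e-1, 1e-2, 1e-3):
    r = remainder(math.sin, math.cos, x, eps)
    # The ratio |R| / eps^2 stays bounded (it approaches sin(1)/2 ~ 0.42)
    print(f"eps={eps:.0e}  |R|/eps^2 = {abs(r) / eps**2:.4f}")
```

The ratio settling near a fixed value (rather than blowing up) is exactly what the $O(\epsilon^2)$ bound asserts; by contrast, $|R|/\epsilon$ would shrink to zero, which is why the linear term must be kept separate.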
Context
StackExchange Computer Science Q#38434, answer score: 3