Gotcha · Moderate
Is there a meaningful difference between O(1) and O(log n)?
Problem
A computer can only process numbers smaller than, say, $2^{64}$ in a single operation, so even an $O(1)$ algorithm only takes constant time if $n < 2^{64}$. If I somehow had an array of $2^{1000}$ elements to process, even an $O(1)$ operation such as an index lookup would start to take longer, because the index has to be computed across multiple machine words. I think it would take at least $O(\log n)$ time.
Similarly, even if an algorithm has $O(\log n)$ complexity, $\log n$ cannot possibly grow larger than about a hundred, so it could be ignored as no more than a small constant.
So, is it really meaningful to treat $O(1)$ and $O(\log n)$ as different?
The same applies to any factor-of-$\log n$ difference, such as between $O(n)$, $O(n\log n)$ and $O(n/\log n)$.
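For concreteness, here is a minimal sketch (Python, assuming a 64-bit machine word) of how many words such an index occupies:

```python
WORD_BITS = 64  # assumed machine word size

def words_needed(index: int) -> int:
    """Number of WORD_BITS-sized machine words needed to hold `index`."""
    return max(1, -(-index.bit_length() // WORD_BITS))  # ceiling division

# An index below 2**64 fits in one word and really is a single operation...
print(words_needed(2**63))    # 1
# ...but an index on the order of 2**1000 spans many words, so even one
# "constant-time" index calculation costs ~16 word operations here.
print(words_needed(2**1000))  # 16
```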
Solution
There are actually all sorts of cases where $\log n$ gets way bigger than 100. For example, if you're working with variable-length integers - say, for cryptography - $\log n$ represents the number of bits in the integer, which can get up to 4096 or higher in practice. Those bits will be spread out over many machine words, so the $\log n$ term refers to how many machine words the number occupies and therefore definitely has an impact on the runtime.
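To make that concrete, here is a rough sketch (Python, assuming 64-bit machine words and a simple least-significant-first limb layout) of schoolbook addition, where the work done is proportional to the number of words the operands occupy, i.e. to $\log n$:

```python
WORD_BITS = 64          # assumed machine word size
BASE = 1 << WORD_BITS   # each "limb" holds one machine word

def to_limbs(n: int) -> list[int]:
    """Split a non-negative integer into limbs, least significant first."""
    limbs = []
    while True:
        limbs.append(n % BASE)
        n //= BASE
        if n == 0:
            return limbs

def add_limbs(a: list[int], b: list[int]) -> list[int]:
    """Schoolbook addition: the loop body runs once per machine word,
    so the cost grows with the bit length of the operands."""
    out, carry = [], 0
    for i in range(max(len(a), len(b))):
        s = (a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0) + carry
        out.append(s % BASE)
        carry = s // BASE
    if carry:
        out.append(carry)
    return out

# A 4096-bit operand (an RSA-sized modulus) occupies 64 limbs, so even a
# single addition is ~64 word operations rather than one.
x = to_limbs(2**4096 - 1)
y = to_limbs(2**4096 - 1)
print(len(x), len(add_limbs(x, y)))  # 64 65
```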
Similarly, $O(1)$ does not mean "constant time only if $n < 2^{64}$." Yes, the computer has a fixed word size, but that doesn't mean that an $O(1)$-time algorithm given a gigantic input can't run in a fixed amount of time. Taking large integers as an example, consider the problem of checking whether an integer is even or odd. That just requires you to look at the last digit, which takes time $O(1)$ even if the number is spread out over thousands of machine words.
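A minimal sketch of that parity check, assuming the same least-significant-first limb layout as above:

```python
WORD_BITS = 64
BASE = 1 << WORD_BITS

def is_even(limbs: list[int]) -> bool:
    """Parity depends only on the lowest bit of the least significant limb,
    so the check is O(1) no matter how many words the number occupies."""
    return limbs[0] % 2 == 0

# Build a limb array (least significant first) for a number that spans
# thousands of machine words.
n = 3**100000
limbs = []
while True:
    limbs.append(n % BASE)
    n //= BASE
    if n == 0:
        break

print(len(limbs))      # 2477 limbs, i.e. thousands of machine words
print(is_even(limbs))  # False: a power of 3 is odd
```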
The last detail is that we typically assume a machine model in which the word size scales with the log of the size of the input. This lets you safely assume that array accesses take time $O(1)$, because array indices are assumed to fit into a constant number of machine words.
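As a tiny numeric sanity check of that assumption (the 64-bit word size here is only illustrative):

```python
import math

WORD_BITS = 64  # illustrative word size; the model assumes it is Theta(log n)

def words_per_index(n: int) -> int:
    """Machine words needed to store an index into an array of length n."""
    bits = max(1, (n - 1).bit_length())   # bits needed for indices 0..n-1
    return math.ceil(bits / WORD_BITS)

# As long as the word size keeps pace with log n, any valid index fits in a
# constant number of words, so one array access is O(1) word operations.
for n in (10**6, 2**32, 2**64):
    print(n, words_per_index(n))   # each index fits in 1 word
```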
Context
StackExchange Computer Science Q#57814, answer score: 12