Is the runtime complexity of sorting $O(n\log n)$ or $O(n\log^2 n)$?
Problem
Suppose we want to sort an array that contains $n$ different integers in the range $[1,2n]$. It is known that this requires $\Theta(n\log n)$ comparisons. But comparing integers which might be as large as $2n$ may require time $\Theta(\log n)$, since we have to compare them bitwise. So, apparently the runtime complexity of sorting is $\Theta(n\log^2 n)$. Is this correct? If so, why is it taught that the complexity of sorting is $\Theta(n\log n)$? Is there a better algorithm for sorting which runs in time $\Theta(n\log n)$?
EDIT: I understand that the question depends on the computational model. In fact, I just found an interesting unanswered question that specifically asks about the runtime complexity of sorting on a Turing machine. Originally, I had in mind a realistic computer, only with unlimited memory. In such a computer, we can represent arbitrarily large integers (as sequences of bytes). We are given an array that contains $n$ different integers, each of which is between $1$ and $2n$. We measure the clock-time it takes to sort this array, as a function of $n$. What function will we see?
Solution
Algorithms are usually analyzed using the random access machine model. In this model, arithmetic operations on machine words take time $O(1)$. A machine word contains $O(\log n)$ bits, where $n$ is some natural parameter, say the length of the input (in machine words!).
Given this definition, comparison-based sorting algorithms run in time $O(n\log n)$ as long as the arguments fit in a constant number of machine words.
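To make the word-RAM accounting concrete, here is a minimal merge sort that counts element comparisons; in this model each comparison of word-sized keys costs $O(1)$, so the total comparison count is the running time up to a constant factor (the helper names below are illustrative, not from the original post):

```python
import math
import random

def merge_sort(a, counter):
    """Sort a list, tallying key comparisons in counter[0]."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid], counter)
    right = merge_sort(a[mid:], counter)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        counter[0] += 1  # one O(1) word comparison in the RAM model
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

n = 1024
counter = [0]
arr = random.sample(range(1, 2 * n + 1), n)  # n distinct ints in [1, 2n]
assert merge_sort(arr, counter) == sorted(arr)
assert counter[0] <= n * math.log2(n)  # at most n log2(n) comparisons
```

The keys here fit in one machine word (they are at most $2n$), so each comparison really is constant time and the $O(n\log n)$ bound holds.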
Sorting algorithms probably take more than $n\log^2 n$ time on a Turing machine, since Turing machines have no random access. Turing machines are not so interesting, however, for analyzing the running time of algorithms, except for special circumstances such as proving results on the Turing machine model itself.
In your particular case, you can actually sort the array in $O(n)$ using counting sort. The $\Omega(n\log n)$ lower bound only holds for some models such as the comparison model.
Context
StackExchange Computer Science Q#71291, answer score: 6