Why is the time complexity of insertion sort not brought down even if we use binary search for the comparisons?
Problem
There are two factors that decide the running time of the insertion sort algorithm:
the number of comparisons, and the number of movements.
In the case of the number of comparisons, the sorted part (left side of $j$) of the array is searched linearly for the right place of the $j^{th}$ element. If instead we use a binary search, the time complexity of finding a place for the $j^{th}$ element comes down from $\operatorname{O}(n)$ to $\operatorname{O}(\log n)$. So, over all $n$ elements, the time complexity for comparisons becomes $\operatorname{O}(n \log n)$. Even so, the movements still take $\operatorname{O}(n)$ time per element, and the total time complexity isn't brought down and remains $\operatorname{O}(n^2)$. Why is that?
Are any of my statements wrong assumptions?
Can a possible explanation be: the total time complexity isn't brought down and remains $\operatorname{O}(n^2)$ because searching for an element's place (using binary search) takes $\operatorname{O}(\log n)$ time, and moving the elements takes $\operatorname{O}(n)$ time, so each insertion costs $\operatorname{O}(\log n)+\operatorname{O}(n)=\operatorname{O}(n)$ time. Doing this for $n-1$ elements then takes $(n-1) \cdot \operatorname{O}(n)=\operatorname{O}(n^2)$ time?
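For concreteness, here is a minimal Python sketch of the variant described above (the function name is mine, not from the question), using the standard `bisect` module to locate the insertion point. The search per element is logarithmic, but the slice assignment that makes room still shifts up to $j$ elements:

```python
import bisect

def binary_insertion_sort(a):
    """Insertion sort that locates each insertion point by binary search.

    The search costs O(log j) comparisons for the j-th element, but
    making room for it still shifts up to j elements, so the worst-case
    running time remains O(n^2).
    """
    for j in range(1, len(a)):
        key = a[j]
        # O(log j) comparisons to find where key belongs in a[0:j].
        pos = bisect.bisect_right(a, key, 0, j)
        # O(j) movements: every element in a[pos:j] shifts one slot right.
        a[pos + 1:j + 1] = a[pos:j]
        a[pos] = key
    return a
```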
Solution
For the $j^{th}$ element, you would do $\sim \log j$ comparisons and (in the worst case) $\sim j$ shifts.
Summing over $j$, you get
$$
\sum_{j = 1}^{n} (j + \log j) = \frac{n(n+1)}{2} + \log (n!) = O(n^2 + n \log n) = O(n^2)
$$
The idea is that the linear work of shifting trumps the logarithmic work of comparing. You end up doing fewer comparisons, but still a linear amount of work per iteration, so the complexity does not change.
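To see the imbalance concretely, here is a sketch (the helper name and counting scheme are mine, not from the answer) that counts comparisons and shifts separately on a reverse-sorted input, the worst case. The comparison count grows like $\log(n!) = O(n \log n)$, while the shift count is exactly $\frac{n(n-1)}{2}$:

```python
def count_work(n):
    """Run binary-search insertion sort on the reversed input [n, ..., 1]
    and return (comparisons, shifts)."""
    a = list(range(n, 0, -1))
    comparisons = shifts = 0
    for j in range(1, len(a)):
        key = a[j]
        lo, hi = 0, j
        # Explicit binary search so each comparison can be counted.
        while lo < hi:
            mid = (lo + hi) // 2
            comparisons += 1
            if key < a[mid]:
                hi = mid
            else:
                lo = mid + 1
        shifts += j - lo          # every element in a[lo:j] moves one slot
        a[lo + 1:j + 1] = a[lo:j]
        a[lo] = key
    return comparisons, shifts
```

On reversed input every key lands at position 0, so the $j^{th}$ insertion shifts all $j$ earlier elements; the shifts dominate the comparisons by a factor of about $n / \log n$.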
Context
StackExchange Computer Science Q#65568, answer score: 10