Algorithm | Worst Case | Average Case | Best Case | Min. no. of swaps | Max. no. of swaps |
---|---|---|---|---|---|
Bubble | $\Theta(n^{2})$ | $\Theta(n^{2})$ | $\Theta(n)$ | 0 | $\Theta(n^{2})$ |
Selection | $\Theta(n^{2})$ | $\Theta(n^{2})$ | $\Theta(n^{2})$ | 0 | $\Theta(n)$ |
Insertion | $\Theta(n^{2})$ | $\Theta(n^{2})$ | $\Theta(n)$ | 0 | $\Theta(n^{2})$ |
Quick | $\Theta(n^{2})$ | $\Theta(n\lg n)$ | $\Theta(n\lg n)$ | 0 | $\Theta(n^{2})$ |
Merge | $\Theta(n\lg n)$ | $\Theta(n\lg n)$ | $\Theta(n\lg n)$ | Is not in-place sorting | Is not in-place sorting |
Heap | $\Theta(n\lg n)$ | $\Theta(n\lg n)$ | $\Theta(n\lg n)$ | $O(n\lg n)$ | $\Theta(n\lg n)$ |
The best case for insertion sort happens when the input array is already sorted. Each new element is compared once with the element before it and stays where it is, so the complexity is $\Theta(n)$.
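A minimal sketch of this best case, with a comparison counter added for illustration (the function name and counter are my own, not from the text): on an already-sorted input of length $n$, the inner loop exits after a single comparison per element, giving $n-1$ comparisons in total.

```python
def insertion_sort(a):
    """Sort a in place; return the number of comparisons made."""
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements one slot to the right. On sorted
        # input this loop exits after one comparison per element.
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

print(insertion_sort(list(range(10))))  # prints 9: Θ(n) for sorted input
```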
Interestingly, the worst case for quick sort also happens when the input array is sorted (either ascending or descending). This happens only if we choose the pivot as either the first element or the last element. In either of these cases, the partition splits the array into $n-1$ elements on one side of the pivot and none on the other, which makes the complexity go to $\Theta(n^2)$.
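This degenerate splitting can be observed directly. The sketch below uses the Lomuto partition scheme with the last element as pivot (one common choice; the helper names and the `sizes` log are my own additions for illustration) and records how unevenly each call splits its range.

```python
def partition(a, lo, hi):
    """Lomuto partition using the last element a[hi] as pivot."""
    pivot = a[hi]
    i = lo - 1
    for j in range(lo, hi):
        if a[j] <= pivot:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[hi] = a[hi], a[i + 1]
    return i + 1

def quicksort(a, lo, hi, sizes):
    if lo < hi:
        p = partition(a, lo, hi)
        # Record the sizes of the two sides of this split.
        sizes.append((p - lo, hi - p))
        quicksort(a, lo, p - 1, sizes)
        quicksort(a, p + 1, hi, sizes)

splits = []
quicksort(list(range(8)), 0, 7, splits)
print(splits)  # every split is (k, 0): n-1 elements vs. none
```

On sorted input the pivot is always the maximum of its range, so every call produces a maximally unbalanced split, and the recursion depth (and total work) becomes quadratic.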
When sorting happens within the same array, it is called in-place sorting. All of the above sorting algorithms except merge sort are in-place. Merge sort uses an auxiliary array for merging two sub-arrays, and hence it is not in-place.
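The auxiliary storage is visible in a straightforward merge sort sketch (a minimal top-down version of my own, not taken from the text): the `merged` list built during each merge is the $\Theta(n)$ extra space that makes the algorithm not in-place.

```python
def merge_sort(a):
    """Return a new sorted list built with Θ(n) auxiliary space."""
    if len(a) <= 1:
        return a[:]
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    merged = []  # auxiliary array: the reason merge sort is not in-place
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```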
Stability matters only for arrays with some element(s) repeated. If $A[x] = A[y]$ in the original array with $x<y$, then in the sorted array $A[x]$ must be placed at position $u$ and $A[y]$ at position $v$ such that $u<v$; i.e., the relative order of equal-valued elements must be the same in the sorted array as in the input array for the sorting algorithm to be stable. In most cases stability depends on how the algorithm is implemented. Still, the standard heap sort cannot be implemented as a stable sort.
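Instability is easy to demonstrate with records that carry a payload alongside the sort key. Below is a small in-place heap sort of my own (the names and the example records are illustrative assumptions, not from the text) that compares only the key; the heap's swaps reorder the two records with key `"b"` relative to their input order, while Python's built-in `sorted`, which is guaranteed stable, preserves it.

```python
def heap_sort(a, key=lambda x: x):
    """In-place heap sort comparing only key(x); not stable."""
    n = len(a)

    def sift_down(root, end):
        # Restore the max-heap property for a[root:end].
        while 2 * root + 1 < end:
            child = 2 * root + 1
            if child + 1 < end and key(a[child]) < key(a[child + 1]):
                child += 1
            if key(a[root]) < key(a[child]):
                a[root], a[child] = a[child], a[root]
                root = child
            else:
                return

    for start in range(n // 2 - 1, -1, -1):  # build the heap
        sift_down(start, n)
    for end in range(n - 1, 0, -1):          # repeatedly extract the max
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)

records = [("b", 1), ("a", 2), ("b", 3), ("a", 4)]

stable = sorted(records, key=lambda r: r[0])
print(stable)   # [('a', 2), ('a', 4), ('b', 1), ('b', 3)] — input order kept

heap_sort(records, key=lambda r: r[0])
print(records)  # [('a', 2), ('a', 4), ('b', 3), ('b', 1)] — 'b' records swapped
```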