Revision as of 15:48, 2 January 2014

{| class="wikitable"
! Algorithm !! Worst Case !! Average Case !! Best Case !! Min. no. of swaps !! Max. no. of swaps
|-
| Bubble || $\Theta(n^{2})$ || $\Theta(n^{2})$ || $\Theta(n^{2})$ || 0 || $\Theta(n^{2})$
|-
| Selection || $\Theta(n^{2})$ || $\Theta(n^{2})$ || $\Theta(n^{2})$ || 0 || $\Theta(n)$
|-
| Insertion || $\Theta(n^{2})$ || $\Theta(n^{2})$ || $\Theta(n)$ || 0 || $\Theta(n^{2})$
|-
| Quick || $\Theta(n^{2})$ || $\Theta(n\lg n)$ || $\Theta(n\lg n)$ || 0 || $\Theta(n^{2})$
|-
| Merge || $\Theta(n\lg n)$ || $\Theta(n\lg n)$ || $\Theta(n\lg n)$ || colspan="2" | Is not in-place sorting
|-
| Heap || $\Theta(n\lg n)$ || $\Theta(n\lg n)$ || $\Theta(n\lg n)$ || $O(n\lg n)$ || $\Theta(n\lg n)$
|}

(A sorted input needs no swaps at all, so the minimum number of swaps for bubble sort is $0$; and selection sort makes at most one swap per pass, so its maximum is $\Theta(n)$, not $\Theta(n^{2})$.)

The best case for insertion sort happens when the input array is already sorted. Each new element is then compared only with its predecessor and stays where it is, so the complexity is $O(n)$.
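A minimal sketch of this in Python (not from the original notes): the sort counts its comparisons, and on an already-sorted input each of the $n-1$ insertions costs exactly one comparison.

```python
def insertion_sort(a):
    """Sort the list a in place; return the number of key comparisons made."""
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements one slot to the right to make room for key.
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            else:
                break  # a[j] <= key: insertion point found
        a[j + 1] = key
    return comparisons

insertion_sort([5, 2, 4, 6, 1, 3])          # sorts in place
insertion_sort(list(range(10)))             # sorted input: only n-1 = 9 comparisons
```

On sorted input the inner loop exits after one comparison every time, giving the $O(n)$ best case; on reverse-sorted input every element is shifted all the way left, giving $\Theta(n^{2})$.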

Interestingly, the worst case for quick sort also happens when the input array is sorted (either ascending or descending). This happens if we choose the pivot as either the first element or the last element. In either of these cases, the pivot splits the array into one part with $n-1$ elements and another that is empty, which makes the complexity $\Theta(n^{2})$.
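To see the quadratic blow-up concretely, here is a sketch (Python, not part of the original notes) of quick sort with Lomuto partitioning and the last element as pivot, counting comparisons. On a sorted input of size $n$ every partition is maximally unbalanced, so the total is $(n-1)+(n-2)+\cdots+1 = n(n-1)/2$.

```python
def partition(a, lo, hi):
    """Lomuto partition with a[hi] as pivot; return (pivot index, comparisons)."""
    pivot = a[hi]
    i = lo
    comps = 0
    for j in range(lo, hi):
        comps += 1
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]   # place pivot in its final position
    return i, comps

def quicksort(a, lo=0, hi=None):
    """Sort a[lo..hi] in place; return the total number of comparisons."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return 0
    p, comps = partition(a, lo, hi)
    return comps + quicksort(a, lo, p - 1) + quicksort(a, p + 1, hi)

quicksort(list(range(12)))   # sorted input: 12*11/2 = 66 comparisons
```

On the sorted input the pivot is always the maximum, so each call recurses on a subarray only one element smaller, which is exactly the $\Theta(n^{2})$ behaviour described above.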

===In-Place Sorting===

When sorting happens within the same array, it's called in-place sorting. All the above sorting algorithms except merge sort are in-place. In merge sort, we use an auxiliary array for merging two sub-arrays, and hence it's not in-place sorting.
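The merge step is where the extra storage appears; a sketch (Python, not from the original notes) of merging two adjacent sorted runs `a[lo..mid]` and `a[mid+1..hi]` makes the auxiliary array explicit:

```python
def merge(a, lo, mid, hi):
    """Merge sorted runs a[lo..mid] and a[mid+1..hi] using an auxiliary copy."""
    aux = a[lo:hi + 1]           # O(n) extra storage: why merge sort is not in-place
    i, j = 0, mid - lo + 1       # heads of the left and right runs inside aux
    for k in range(lo, hi + 1):
        if i > mid - lo:         # left run exhausted
            a[k] = aux[j]; j += 1
        elif j > hi - lo:        # right run exhausted
            a[k] = aux[i]; i += 1
        elif aux[j] < aux[i]:
            a[k] = aux[j]; j += 1
        else:                    # on ties take from the left run (keeps merge sort stable)
            a[k] = aux[i]; i += 1
    return a

merge([1, 4, 5, 2, 3, 6], 0, 2, 5)   # -> [1, 2, 3, 4, 5, 6]
```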

===Stable sorting===

This matters only for arrays with some element(s) repeated. If $A[x] = A[y]$ in the original array with $x<y$, then in the sorted array $A[x]$ must be placed at position $u$ and $A[y]$ at position $v$ such that $u<v$; i.e., the relative order of equal-valued elements must be the same in the sorted array as in the input array for the sorting algorithm to be stable. In most cases stability depends on how the algorithm is implemented. Still, it's not possible to implement heap sort as a stable sort.
