<metadesc>Complexities of basic sorting algorithms. Minimum no. of swaps required for sorting algorithms</metadesc>

{{alert|Please don't learn this table by heart. It is only meant to check your understanding; you should know how these complexities arise.|alert-danger}}
 
{|class="wikitable" style="color: black; text-align: center; " width="85%"
!align = "center" |Algorithm
!align = "center" |Worst Case
!align = "center" |Average Case
!align = "center" |Best Case
!align = "center" |Min. no. of swaps
!align = "center" |Max. no. of swaps
|-
! align = "center"|[http://en.wikipedia.org/wiki/Bubble_sort Bubble]
! align = "center"|$\Theta(n^{2})$
! align = "center"|$\Theta(n^{2})$
! align = "center"|$\Theta(n)$
! align = "center"|0
! align = "center"|$\Theta(n^{2})$
|-
! align = "center"|[http://en.wikipedia.org/wiki/Selection_sort Selection]
! align = "center"|$\Theta(n^{2})$
! align = "center"|$\Theta(n^{2})$
! align = "center"|$\Theta(n^{2})$
! align = "center"|0
! align = "center"|$\Theta(n)$
|-
! align = "center"|[http://en.wikipedia.org/wiki/Insertion_sort Insertion]
! align = "center"|$\Theta(n^{2})$
! align = "center"|$\Theta(n^{2})$
! align = "center"|$\Theta(n)$
! align = "center"|0
! align = "center"|$\Theta(n^{2})$
|-
! align = "center"|[http://en.wikipedia.org/wiki/Quick_sort Quick]
! align = "center"|$\Theta(n^{2})$
! align = "center"|$\Theta(n\lg n)$
! align = "center"|$\Theta(n\lg n)$
! align = "center"|0
! align = "center"|$\Theta(n^{2})$
|-
! align = "center"|[http://en.wikipedia.org/wiki/Merge_sort Merge]
! align = "center"|$\Theta(n\lg n)$
! align = "center"|$\Theta(n\lg n)$
! align = "center"|$\Theta(n\lg n)$
! align = "center"|Is not in-place sorting
! align = "center"|Is not in-place sorting
|-
! align = "center"|[http://en.wikipedia.org/wiki/Heap_sort Heap]
! align = "center"|$\Theta(n\lg n)$
! align = "center"|$\Theta(n\lg n)$
! align = "center"|$\Theta(n\lg n)$
! align = "center"|$O(n\lg n)$
! align = "center"|$\Theta(n\lg n)$
|}
  
The best case for insertion sort happens when the input array is already sorted. Each new element is then placed immediately after the sorted prefix, and the complexity is $\Theta(n)$.
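The linear best case is easy to verify with a short sketch (Python is used here purely for illustration; the comparison counter is our own instrumentation, not part of the original notes):

```python
def insertion_sort(a):
    """Sort a in place; return the number of key comparisons performed."""
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements right; on already-sorted input this loop
        # stops after a single comparison, giving Theta(n) overall.
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

sorted_input = list(range(10))          # already sorted: best case
reversed_input = list(range(9, -1, -1)) # reverse sorted: worst case
print(insertion_sort(sorted_input))     # 9 comparisons, Theta(n)
print(insertion_sort(reversed_input))   # 45 comparisons, Theta(n^2)
```

On sorted input each element needs exactly one comparison, so $n-1$ comparisons in total; on reverse-sorted input the counts sum to $n(n-1)/2$.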
  
Interestingly, the worst case for quick sort also happens when the input array is sorted (either ascending or descending). This happens only if we choose the pivot as either the first or the last element. In either of these cases, the partition puts $n-1$ elements on one side of the pivot and none on the other, which makes the complexity go to $\Theta(n^2)$.
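This worst case can be observed with a small counting sketch (Python, illustrative only; the function name and counter are not from the original notes), using Lomuto partitioning with the last element as pivot:

```python
def quicksort_comparisons(a, lo=0, hi=None):
    """Quicksort (Lomuto partition, last element as pivot).
    Sorts a in place and returns the number of element comparisons."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return 0
    pivot = a[hi]
    i = lo - 1
    comparisons = 0
    for j in range(lo, hi):
        comparisons += 1
        if a[j] <= pivot:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[hi] = a[hi], a[i + 1]  # pivot ends at index i + 1
    # On sorted input every partition is (n-1, 0), so the total number
    # of comparisons is (n-1) + (n-2) + ... + 1 = n(n-1)/2 = Theta(n^2).
    return (comparisons
            + quicksort_comparisons(a, lo, i)
            + quicksort_comparisons(a, i + 2, hi))

print(quicksort_comparisons(list(range(100))))  # 4950 = 100*99/2
```

With a sorted input the pivot is always the maximum, so every partition is maximally unbalanced, matching the $\Theta(n^2)$ entry in the table above.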
  
 
===In-Place Sorting===
When sorting happens within the same array, it is called in-place sorting. All of the above sorting algorithms except merge sort are in-place. In merge sort, we use an auxiliary array for merging two sub-arrays, and hence it is not in-place.
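A minimal sketch of the merge step (Python, illustrative; not code from the original notes) makes the auxiliary array explicit:

```python
def merge(left, right):
    """Merge two sorted lists into a new auxiliary list.
    This extra list is exactly why merge sort is not in-place."""
    merged = []  # auxiliary storage: Theta(n) extra space
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= keeps equal elements in input order
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

print(merge_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```

In-place merge variants do exist, but they are considerably more involved; the textbook algorithm uses the auxiliary array shown here.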
===Stable sorting===
Stability matters only for arrays in which some element(s) are repeated. If $A[x] = A[y]$ in the original array with $x<y$, then in the sorted array $A[x]$ must be placed at position $u$ and $A[y]$ at position $v$ such that $u<v$; that is, elements of equal value must retain their relative order from the input array for the sorting algorithm to be stable. In most cases stability depends on how the algorithm is implemented. Still, it is not possible to implement heap sort as a stable sort.
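As a quick illustration (Python; the sample records are invented for this example), Python's built-in sort is stable, so records that tie on the sort key keep their input order:

```python
# Records that tie on the sort key (age); a stable sort must keep
# their original relative order.
people = [("Asha", 25), ("Bala", 22), ("Chen", 25), ("Dev", 22)]

# sorted() uses Timsort, a stable merge-sort variant.
by_age = sorted(people, key=lambda p: p[1])
print(by_age)
# [('Bala', 22), ('Dev', 22), ('Asha', 25), ('Chen', 25)]
# Bala stays before Dev, and Asha before Chen: stable.
```

An unstable sort would be free to emit the two 22-year-olds (or the two 25-year-olds) in either order.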
{{Template:FBD}}

[[Category: Algorithms & Data Structures Notes]]

[[Category: Compact Notes for Reference of Understanding]]
Latest revision as of 15:20, 15 February 2016