1. Selection Sort ⭐ S68
1. Verbal Description:
Repeatedly finds the minimum element of the unsorted suffix and
swaps it to the front of that suffix.
selectionSort(S, len):
    for i from 0 to len-1:
        mini = indexOfMin(S, i, len)
        swap(S, i, mini)
2. Analysis:
$W(n) = A(n) = B(n) = \Theta(n^2)$. Exactly $n(n-1)/2$
comparisons. $S(n) = O(1)$. Unstable.
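A minimal runnable C sketch of the pseudocode above (the helper name `indexOfMin` follows the pseudocode; the `main` is only a demo):

```c
#include <stdio.h>

/* Return the index of the minimum element in a[from..len-1]. */
static int indexOfMin(const int a[], int from, int len) {
    int mini = from;
    for (int i = from + 1; i < len; i++)
        if (a[i] < a[mini]) mini = i;
    return mini;
}

void selectionSort(int a[], int len) {
    for (int i = 0; i < len - 1; i++) {   /* last element falls into place */
        int mini = indexOfMin(a, i, len);
        int t = a[i]; a[i] = a[mini]; a[mini] = t;   /* swap min to front */
    }
}

int main(void) {
    int a[] = {5, 2, 4, 1, 3};
    selectionSort(a, 5);
    for (int i = 0; i < 5; i++) printf("%d ", a[i]);  /* 1 2 3 4 5 */
    return 0;
}
```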
2. Insertion Sort ⭐ S70
1. Verbal Description:
Grows a sorted prefix by inserting each new
element into its correct place within it.
insertionSort(arr, len):
    for next from 1 to len-1:
        curr = next; temp = arr[next]
        while(curr > 0 && temp < arr[curr-1]):
            arr[curr] = arr[curr-1]; curr--
        arr[curr] = temp
2. Analysis:
$W(n) = \Theta(n^2)$, $B(n) = \Theta(n)$ (adaptive). $S(n) =
O(1)$. Stable.
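A direct C rendering of the pseudocode (sketch):

```c
void insertionSort(int arr[], int len) {
    for (int next = 1; next < len; next++) {
        int temp = arr[next];   /* element to insert into the sorted prefix */
        int curr = next;
        /* Shift strictly larger elements right; the strict '<' leaves
           equal keys in their original order, which makes the sort stable. */
        while (curr > 0 && temp < arr[curr - 1]) {
            arr[curr] = arr[curr - 1];
            curr--;
        }
        arr[curr] = temp;
    }
}
```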
3. Merge Sort ⭐ S76
mergeSort(arr, l, r):
    if(l < r):
        m = (l + r) / 2     // integer division (floor)
        mergeSort(arr, l, m); mergeSort(arr, m+1, r)
        merge(arr, l, m, r)
Analysis:
$W(n) = \Theta(n \log n)$. $S(n) = O(n)$ (not in-place).
Data Structure Choice: with **linked lists**, merge can relink nodes instead of copying
into an auxiliary array, avoiding the $O(n)$ extra space that is "unacceptably high"
for the array-based version.
Stable.
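A C sketch of the array version (a production version would allocate the temp buffer once rather than per merge):

```c
#include <stdlib.h>
#include <string.h>

/* Merge sorted halves arr[l..m] and arr[m+1..r] via a temp buffer.
   The '<=' tie-break takes from the left half first, preserving stability. */
static void merge(int arr[], int l, int m, int r) {
    int n = r - l + 1;
    int *tmp = malloc(n * sizeof *tmp);
    int i = l, j = m + 1, k = 0;
    while (i <= m && j <= r)
        tmp[k++] = (arr[i] <= arr[j]) ? arr[i++] : arr[j++];
    while (i <= m) tmp[k++] = arr[i++];   /* drain leftovers */
    while (j <= r) tmp[k++] = arr[j++];
    memcpy(arr + l, tmp, n * sizeof *tmp);
    free(tmp);
}

void mergeSort(int arr[], int l, int r) {
    if (l < r) {
        int m = l + (r - l) / 2;   /* avoids overflow of (l + r) / 2 */
        mergeSort(arr, l, m);
        mergeSort(arr, m + 1, r);
        merge(arr, l, m, r);
    }
}
```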
Merge Sort Recurrence Summary

| Metric | Best Case $B(n)$ | Worst Case $W(n)$ |
| --- | --- | --- |
| Recurrence | $2B(n/2) + n/2$ | $2W(n/2) + n - 1$ |
| Closed Form ($n = 2^k$) | $\frac{n}{2} \lg n$ | $n \lg n - n + 1$ |
| Big-Theta | $\Theta(n \lg n)$ | $\Theta(n \lg n)$ |
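A quick unrolling of the worst-case recurrence shows where the closed form comes from (using $W(1) = 0$ and $n = 2^k$):

$$W(n) = 2W(n/2) + n - 1 = 4W(n/4) + 2n - 3 = \cdots = 2^k W(1) + kn - (2^k - 1) = n \lg n - n + 1.$$

Each of the $k = \lg n$ levels contributes $n$ minus the number of subproblems at that level, and those subtracted counts sum to $1 + 2 + \cdots + 2^{k-1} = n - 1$.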
4. Partition (Lecture) ⭐ S93
1. Verbal Description:
Rearranges elements around a pivot (a[l]).
Returns index p where a[l..p-1] ≤ a[p] and a[p+1..r] ≥ a[p].
partition(a, l, r):
    i = l+1; j = r; m = a[l]
    do:
        while(i < r && a[i] <= m) i++
        while(j > l && a[j] >= m) j--   // guard is j > l, so j always stops on a value < m (or at l)
        if(i < j) swap(a[i], a[j])
    while(i < j)
    swap(a[l], a[j]); return j
Pivot Properties (after):
- $a[j] = m$ (pivot in its final sorted position)
- $a[l..j-1] \le m$ (left side)
- $a[j+1..r] \ge m$ (right side)
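A runnable C version of this partition (the `swapInt` helper is illustrative):

```c
static void swapInt(int *x, int *y) { int t = *x; *x = *y; *y = t; }

/* Partition a[l..r] around the pivot m = a[l]; returns the pivot's final index. */
int partition(int a[], int l, int r) {
    int i = l + 1, j = r, m = a[l];
    do {
        while (i < r && a[i] <= m) i++;   /* scan right past elements <= pivot */
        while (j > l && a[j] >= m) j--;   /* scan left past elements >= pivot */
        if (i < j) swapInt(&a[i], &a[j]); /* put the out-of-place pair in order */
    } while (i < j);
    swapInt(&a[l], &a[j]);                /* drop the pivot into its final slot */
    return j;
}
```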
5. QuickSort ⭐ S93
quickSort(a, l, r):
    if(l < r):
        p = partition(a, l, r)
        quickSort(a, l, p-1); quickSort(a, p+1, r)
Analysis:
$W(n) = \Theta(n^2)$, $A(n) = \Theta(n \log n)$. $S(n) = O(\log n)$ expected stack
space (the worst-case recursion depth is $\Theta(n)$, e.g. on already-sorted input). Unstable.
In-Place Efficiency: QuickSort is fast in practice because the Partition procedure
works in-place, needing only constant auxiliary space for comparing and swapping,
which makes it practical for large datasets.
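In C this is just a thin wrapper around the partition from the previous section (prototype repeated so the sketch stands alone):

```c
int partition(int a[], int l, int r);   /* defined in the Partition section above */

void quickSort(int a[], int l, int r) {
    if (l < r) {
        int p = partition(a, l, r);     /* pivot lands at its final index p */
        quickSort(a, l, p - 1);         /* pivot itself is never revisited */
        quickSort(a, p + 1, r);
    }
}
```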
6. Binary Heap Foundation ⭐ S150
siftDown(heap, i, n):   // n = heap size, valid indices 0..n-1
    l = 2i+1; r = 2i+2; min = i
    if(l < n && heap[l] < heap[min]) min = l
    if(r < n && heap[r] < heap[min]) min = r
    if(min != i): swap(heap[i], heap[min]); siftDown(heap, min, n)
buildHeap(heap, n): for(i = n/2-1; i >= 0; i--) siftDown(heap, i, n)
7. HeapSort ⭐ S158
heapSort(arr, n):
    buildHeap(arr, n)
    for(i = n-1; i > 0; i--):
        swap(arr[0], arr[i]); siftDown(arr, 0, i)   // heap shrinks to arr[0..i-1]
Analysis:
$W(n) = A(n) = \Theta(n \log n)$. $S(n) = O(1)$ (in-place).
Unstable.
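A C sketch covering both sections above. Note the pseudocode maintains a *min*-heap, so repeatedly moving the root to the back yields descending order; flipping the two comparisons (max-heap) sorts ascending:

```c
/* Min-heap siftDown on heap[0..n-1]; children of i are 2i+1 and 2i+2. */
void siftDown(int heap[], int i, int n) {
    int l = 2 * i + 1, r = 2 * i + 2, min = i;
    if (l < n && heap[l] < heap[min]) min = l;
    if (r < n && heap[r] < heap[min]) min = r;
    if (min != i) {
        int t = heap[i]; heap[i] = heap[min]; heap[min] = t;
        siftDown(heap, min, n);           /* keep sifting the displaced key */
    }
}

void buildHeap(int heap[], int n) {
    for (int i = n / 2 - 1; i >= 0; i--)  /* leaves are already 1-element heaps */
        siftDown(heap, i, n);
}

/* Min-heap version: sorts descending. Use '>' in siftDown for ascending. */
void heapSort(int arr[], int n) {
    buildHeap(arr, n);
    for (int i = n - 1; i > 0; i--) {
        int t = arr[0]; arr[0] = arr[i]; arr[i] = t;  /* extract root to the end */
        siftDown(arr, 0, i);              /* only arr[0..i-1] is still a heap */
    }
}
```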
8. CountSort ⭐ S106
countSort(a, n, k):
    // keys are integers in the range [0, k]
    C = new Array(k+1)                   // auxiliary counts: O(k) space
    B = new Array(n)                     // output array: O(n) space
    for(i=0; i<n; i++) C[a[i]]++
    for(i=1; i<=k; i++) C[i] += C[i-1]   // cumulative counts
    for(i=n-1; i>=0; i--) B[--C[a[i]]] = a[i]   // backwards pass places equal keys stably
    return B
Analysis:
Time: $W(n) = \Theta(n + k)$. Space: $S(n) = O(k)$ auxiliary
(count array) $+ O(n)$ (output array). Stable.
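A C rendering; `calloc` supplies the zero-initialized count array, and the result is copied back for convenience:

```c
#include <stdlib.h>
#include <string.h>

/* Counting sort for keys in [0, k]; sorted output is written back into a. */
void countSort(int a[], int n, int k) {
    int *C = calloc(k + 1, sizeof *C);   /* O(k) auxiliary count array */
    int *B = malloc(n * sizeof *B);      /* O(n) output array */
    for (int i = 0; i < n; i++) C[a[i]]++;
    for (int i = 1; i <= k; i++) C[i] += C[i - 1];   /* cumulative counts */
    for (int i = n - 1; i >= 0; i--)     /* backwards pass => stable */
        B[--C[a[i]]] = a[i];
    memcpy(a, B, n * sizeof *B);
    free(B); free(C);
}
```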
9. Radix Sort ⭐ S109
radixSort(a, d):
    for i from 1 to d:              // digit 1 = least significant
        stablySortByDigit(a, i)
Analysis:
$W(n) = \Theta(d(n + k))$. Requires a stable underlying sort.
Stable.
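A C sketch for non-negative decimal integers; `countByDigit` is an illustrative name for the lecture's stablySortByDigit, implemented here as a counting pass on one digit:

```c
#include <stdlib.h>
#include <string.h>

/* Stable counting pass on the digit selected by div (1, 10, 100, ...). */
static void countByDigit(int a[], int n, int div) {
    int C[10] = {0};                      /* base-10 digit counts */
    int *B = malloc(n * sizeof *B);
    for (int i = 0; i < n; i++) C[(a[i] / div) % 10]++;
    for (int d = 1; d < 10; d++) C[d] += C[d - 1];
    for (int i = n - 1; i >= 0; i--)      /* backwards => stable */
        B[--C[(a[i] / div) % 10]] = a[i];
    memcpy(a, B, n * sizeof *B);
    free(B);
}

/* LSD radix sort for non-negative ints with at most d decimal digits. */
void radixSort(int a[], int n, int d) {
    int div = 1;
    for (int pass = 0; pass < d; pass++, div *= 10)
        countByDigit(a, n, div);
}
```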
| Algorithm | W(n) | A(n) | B(n) | Space | Stable? | Notes |
| --- | --- | --- | --- | --- | --- | --- |
| Selection Sort ⭐ [S67] | O(n²) | O(n²) | O(n²) | O(1) | No | Find min, swap to front |
| Insertion Sort ⭐ [S72] | O(n²) | O(n²) | O(n) | O(1) | Yes | Best on nearly sorted |
| Merge Sort ⭐ [S78] | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes | Divide & Conquer |
| QuickSort [S92] | O(n²) | O(n log n) | O(n log n) | O(log n) | No | Worst on sorted input |
| HeapSort [S163] | O(n log n) | O(n log n) | O(n log n) | O(1) | No | In-place, uses heap |
| CountSort ⭐ [S106] | O(n + k) | O(n + k) | O(n + k) | O(k) | Yes | k = range of values |
| RadixSort ⭐ [S108] | O(d·n) | O(d·n) | O(d·n) | O(n) | Yes | d = number of digits |
⚖️ What is Stability? S91
A sorting algorithm is stable if it preserves the relative
order of elements with equal keys.
Example: [A1, B, A2] sorted by letter → Stable: [A1, A2, B] | Unstable: [A2, A1, B]
Iterative Sorting: If an algorithm is stable, it is possible to sort records by
multiple attributes in iterations (from least significant to most significant
attribute) without destroying the sorting order of the previous iteration.
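A small sketch of this multi-pass idea, reusing stable insertion sort on illustrative (month, day) records:

```c
#include <stdio.h>

typedef struct { int month, day; } Date;

/* Stable insertion sort on an integer key extracted by 'key'. */
static void stableSortBy(Date a[], int n, int (*key)(Date)) {
    for (int next = 1; next < n; next++) {
        Date temp = a[next];
        int curr = next;
        while (curr > 0 && key(temp) < key(a[curr - 1])) {
            a[curr] = a[curr - 1]; curr--;
        }
        a[curr] = temp;
    }
}

static int byDay(Date d)   { return d.day; }
static int byMonth(Date d) { return d.month; }

int main(void) {
    Date a[] = {{5, 20}, {3, 11}, {5, 2}, {3, 30}};
    stableSortBy(a, 4, byDay);    /* least significant attribute first */
    stableSortBy(a, 4, byMonth);  /* stability keeps days ordered within a month */
    for (int i = 0; i < 4; i++) printf("%d/%d ", a[i].month, a[i].day);
    /* prints 3/11 3/30 5/2 5/20 */
    return 0;
}
```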
Stable Algorithms (Yes)
- Insertion Sort: Only shifts elements strictly smaller. Equal elements
stay in original relative order.
- Merge Sort: Stable if the merge function prefers the left element in case of a tie (L[i] <= R[j]).
- CountSort: Processes the input array backwards
(right-to-left) to place elements in the output correctly.
- RadixSort: Depends on the stability of the underlying sort (usually
CountSort) to work correctly.
Unstable Algorithms (No)
- Selection Sort: The long-distance swap of the minimum
element can jump it over an equal element elsewhere.
- QuickSort: The Partition function swaps elements
across the pivot, often mixing up the relative order of identical values.
- HeapSort: Building the heap and repeatedly extracting the max/min
involves many non-local swaps that destroy stability.
⚠️ Comparison Sort Lower Bound
Any comparison-based sort requires $\Omega(n \log n)$ comparisons in the worst
case.
Decision Tree Model:
- An execution traces a root-to-leaf path in a binary tree whose internal nodes are comparisons and whose leaves are output orderings.
- Number of reachable leaves $L$ must be $\ge n!$ (to cover all input permutations).
- Tree height $h$ (max comparisons): $2^h \ge L \ge n! \implies h \ge \log_2(n!)$.
- Using Stirling's Approximation: $\log_2(n!) \approx n \log_2 n - 1.44n
\implies \Omega(n \lg n)$.
Non-comparison algorithms like CountSort bypass
this by using values as indices.
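Sanity check for $n = 3$: there are $3! = 6$ input orderings, so the decision tree needs height $h \ge \lceil \log_2 6 \rceil = 3$; no algorithm can sort every 3-element input with only 2 comparisons, since 2 comparisons distinguish at most $2^2 = 4 < 6$ outcomes.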
Merge Function ⭐
Merges two sorted segments of lengths n and m. Max comparisons = n + m - 1, since each comparison outputs one element and the last element is output for free. Time: O(n+m). E.g., merging [1,3,5] and [2,4,6] takes 3 + 3 - 1 = 5 comparisons.