Algorithms

Sorting Fundamentals

Rearranging elements into order is the single most enabling operation in computer science. Sorting unlocks binary search, simplifies duplicate detection, powers merge intervals, and is the hidden first step in half of all interview problems.

01 · Section One · Foundation

Why Sorting Matters

[Diagram: UNSORTED [7, 4, 6, 2, 3] → sort → SORTED [2, 3, 4, 6, 7]. Random chaos becomes ordered structure, enabling fast search, merge, and dedup.]

Sorting transforms random data into ordered data, and ordered data is fundamentally easier to work with. Once an array is sorted, binary search finds any element in O(log n) instead of O(n). Duplicate detection becomes a single linear scan of adjacent elements. The "merge intervals" pattern (LC 56) only works if intervals are sorted by start time. The "two-sum" pattern has an O(n) two-pointer variant that requires sorted input. In interviews, sorting is rarely the entire problem; it's the enabling first step that simplifies everything after it.

The theoretical lower bound for comparison-based sorting is Ω(n log n): no comparison sort can beat this in the worst case. Merge Sort and Heap Sort achieve this bound with a guaranteed O(n log n). Quick Sort achieves O(n log n) on average but degrades to O(n²) in the worst case with poor pivot selection. Non-comparison sorts (Counting Sort, Radix Sort, Bucket Sort) beat the bound by exploiting integer properties, achieving O(n + k) where k is the value range, but they only apply to integers within a bounded range.

Java's Arrays.sort() uses Dual-Pivot Quick Sort for primitives and TimSort (a merge sort variant) for objects. In interviews, call Arrays.sort() and focus on the actual problem rather than reimplementing sort.

Analogy: A library with books thrown randomly on shelves. Finding any specific book requires scanning every shelf: O(n). Now sort all books alphabetically. Finding any book becomes a quick binary narrowing (check the middle shelf, go left or right): O(log n). The one-time cost of sorting (reshelving all books) pays for itself the moment you need to search more than once. That's the trade-off: O(n log n) upfront to enable O(log n) per query forever after.

02 · Section Two · Mental Model

How Sorting Thinks

Comparison sorts compare pairs of elements to determine order
  ▶ Ω(n log n) lower bound: n! possible orderings require log(n!) = Ω(n log n) comparisons to distinguish, provable by a decision-tree argument
Divide-and-conquer splits the problem in half at each level
  ▶ log n levels × O(n) work per level = O(n log n), the structure behind both Merge Sort and Quick Sort
Stability means equal elements preserve their original relative order
  ▶ Merge Sort is stable, Quick Sort is not; this matters when sorting objects by one key while preserving order from a previous sort by another key
In-place sorting uses O(1) extra space beyond the input
  ▶ Quick Sort is in-place (O(log n) stack), while Merge Sort on arrays requires O(n) auxiliary space: a key trade-off between time guarantee and memory
Already-sorted or nearly-sorted input is a special case
  ▶ Insertion Sort is O(n) on nearly-sorted data, which is why TimSort (Java's Arrays.sort for objects) combines merge sort with insertion sort for small/sorted runs
Non-comparison sorts exploit the integer value range instead of comparisons
  ▶ Counting Sort: O(n + k) time, O(k) space; fast when k (the max value) is small, impractical when k is huge (e.g., sorting 64-bit integers)
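The Counting Sort trade-off is easy to see in code. Below is a minimal sketch for non-negative integers with a known maximum value (the class and method names are illustrative, not from any library); a stable variant would use prefix sums to compute final positions, but for plain ints the simple tally-and-rebuild version suffices:

```java
import java.util.Arrays;

public class CountingSortDemo {
    // Sorts non-negative ints in [0, maxValue]: O(n + k) time, O(k) space,
    // where k = maxValue. No comparisons between elements at all.
    static int[] countingSort(int[] arr, int maxValue) {
        int[] count = new int[maxValue + 1];
        for (int x : arr) count[x]++;            // tally each value
        int[] out = new int[arr.length];
        int idx = 0;
        for (int v = 0; v <= maxValue; v++)      // emit values in order
            for (int c = 0; c < count[v]; c++)
                out[idx++] = v;
        return out;
    }

    public static void main(String[] args) {
        int[] sorted = countingSort(new int[]{7, 4, 6, 2, 3}, 7);
        System.out.println(Arrays.toString(sorted)); // [2, 3, 4, 6, 7]
    }
}
```

Note how the running time depends on maxValue, not on comparisons: sorting a handful of 64-bit values this way would need an astronomically large count array.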

03 · Section Three · O(n²) Sorts

Bubble Sort & Selection Sort

Bubble Sort and Selection Sort are O(n²) algorithms that nobody uses in production, but they appear in interview conceptual questions and are useful for understanding sorting invariants. Know them conceptually; never implement them when asked to "sort efficiently."

Bubble Sort

Swap adjacent pairs, bubble largest to the end

for i = 0 to n-1:
    for j = 0 to n-i-2:
        if arr[j] > arr[j+1]:
            swap(arr[j], arr[j+1])
// O(n²) time, O(1) space, stable

After pass i, the i largest elements are in their final positions at the end. Optimization: if no swaps occur in a pass, the array is sorted, so exit early (O(n) best case).
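The early-exit optimization translates directly to Java. A minimal sketch (class name is illustrative):

```java
import java.util.Arrays;

public class BubbleSortDemo {
    // Bubble sort with the early-exit optimization: O(n²) worst case,
    // O(n) best case on already-sorted input (one pass, zero swaps).
    static void bubbleSort(int[] arr) {
        int n = arr.length;
        for (int i = 0; i < n - 1; i++) {
            boolean swapped = false;
            for (int j = 0; j < n - i - 1; j++) {
                if (arr[j] > arr[j + 1]) {
                    int t = arr[j]; arr[j] = arr[j + 1]; arr[j + 1] = t;
                    swapped = true;
                }
            }
            if (!swapped) return; // no swaps: array already sorted, stop early
        }
    }

    public static void main(String[] args) {
        int[] a = {7, 4, 6, 2, 3};
        bubbleSort(a);
        System.out.println(Arrays.toString(a)); // [2, 3, 4, 6, 7]
    }
}
```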

Selection Sort

Find minimum, place at front

for i = 0 to n-1:
    minIdx = i
    for j = i+1 to n-1:
        if arr[j] < arr[minIdx]:
            minIdx = j
    swap(arr[i], arr[minIdx])
// O(n²) time, O(1) space, NOT stable

After iteration i, positions 0..i hold the i+1 smallest elements in sorted order. It always performs exactly n(n-1)/2 comparisons; there is no best-case optimization.
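For completeness, the same sketch in Java (class name is illustrative); note that it does at most n-1 swaps, which is why the comparison table credits Selection Sort with minimal swaps:

```java
import java.util.Arrays;

public class SelectionSortDemo {
    // Selection sort: always n(n-1)/2 comparisons, at most n-1 swaps.
    // The long-distance swap is what breaks stability.
    static void selectionSort(int[] arr) {
        for (int i = 0; i < arr.length - 1; i++) {
            int minIdx = i;
            for (int j = i + 1; j < arr.length; j++)
                if (arr[j] < arr[minIdx]) minIdx = j;
            int t = arr[i]; arr[i] = arr[minIdx]; arr[minIdx] = t;
        }
    }

    public static void main(String[] args) {
        int[] a = {7, 4, 6, 2, 3};
        selectionSort(a);
        System.out.println(Arrays.toString(a)); // [2, 3, 4, 6, 7]
    }
}
```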

When O(n²) is acceptable:
  • For n ≤ 20–30 elements, Insertion Sort (next section) outperforms Merge/Quick Sort due to lower constant overhead: no recursion, no auxiliary array, cache-friendly sequential access.
  • This is why TimSort and Introsort switch to insertion sort for small partitions.

04 · Section Four · Insertion Sort

Insertion Sort: The Nearly-Sorted Champion

Insertion Sort builds the sorted portion one element at a time: pick the next unsorted element and insert it into the correct position by shifting larger elements right. It's O(n²) worst case but O(n) on already-sorted or nearly-sorted data, making it the optimal algorithm for small arrays and the finishing step inside hybrid sorts like TimSort. Every hand of cards you've ever sorted was insertion sort: pick up a card, slide it into position among the cards you already hold.

Pseudocode
function insertionSort(arr):
    for i = 1 to n - 1:
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]   // shift right
            j = j - 1
        arr[j + 1] = key          // insert at correct position
// O(n²) worst, O(n) best (already-sorted input)
Insertion Sort: inserting key = 3 into the sorted portion [1, 4, 7]
[Diagram: BEFORE: [1, 4, 7, 3, 9] with key = 3 and sorted portion [1, 4, 7]. AFTER: [1, 3, 4, 7, 9] with 7 shifted right, 4 shifted right, 3 placed at index 1. When the input is already sorted, each element compares once and stays in place, giving O(n).]
Insertion Sort is O(n) on nearly-sorted data:
  • If each element is at most k positions from its final location, insertion sort runs in O(nk). For k = O(1), it's O(n).
  • This is why it's used as the base case in TimSort and Introsort: small subarrays are nearly sorted after partitioning.
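The pseudocode above maps one-to-one onto Java. A minimal sketch (class name is illustrative):

```java
import java.util.Arrays;

public class InsertionSortDemo {
    // Shift-based insertion: O(n²) worst case, but O(n) on sorted input
    // because the inner while loop never executes.
    static void insertionSort(int[] arr) {
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i];
            int j = i - 1;
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j]; // shift larger element right
                j--;
            }
            arr[j + 1] = key;        // insert key into its slot
        }
    }

    public static void main(String[] args) {
        int[] a = {1, 4, 7, 3, 9}; // the example from the diagram above
        insertionSort(a);
        System.out.println(Arrays.toString(a)); // [1, 3, 4, 7, 9]
    }
}
```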

05 · Section Five · Merge Sort

Merge Sort: Guaranteed O(n log n)

Merge Sort is the divide-and-conquer sorting algorithm: split the array in half, recursively sort each half, then merge the two sorted halves in O(n). It gives a guaranteed O(n log n) worst case; no bad input can make it degrade. The merge step is the key insight: merging two sorted arrays into one sorted array costs O(n) using two pointers. The trade-off is O(n) auxiliary space for the merge buffer. Merge Sort is stable (equal elements maintain relative order) and is the foundation of Java's Arrays.sort() for objects (TimSort), of external sorting (sorting data too large for RAM), and of the "count inversions" interview pattern.

Pseudocode
function mergeSort(arr, left, right):
    if left >= right: return            // base case: 0–1 elements
    mid = (left + right) / 2
    mergeSort(arr, left, mid)           // sort left half
    mergeSort(arr, mid + 1, right)      // sort right half
    merge(arr, left, mid, right)        // merge two sorted halves

function merge(arr, left, mid, right):
    temp = new array[right - left + 1]
    i = left, j = mid + 1, k = 0
    while i <= mid and j <= right:
        if arr[i] <= arr[j]:            // <= ensures stability
            temp[k++] = arr[i++]
        else:
            temp[k++] = arr[j++]
    copy remaining elements from left or right half
    copy temp back to arr[left..right]  // O(n) merge
// Total: O(n log n) time, O(n) space
Merge Sort: split tree and merge-back
[Diagram: split [7, 4, 6, 2, 3, 1, 5] → [7, 4, 6, 2] and [3, 1, 5] → [7, 4], [6, 2], [3, 1], [5]. Merge back up: [4, 7], [2, 6], [1, 3], [5] → [2, 4, 6, 7], [1, 3, 5] → [1, 2, 3, 4, 5, 6, 7].]
Merge Sort's merge step is the basis of "merge two sorted arrays/lists":
  • The merge operation appears independently as an interview problem (LC 88 "Merge Sorted Array," LC 21 "Merge Two Sorted Lists") and inside other algorithms (external sort, count inversions).
  • Master the two-pointer merge; it appears everywhere.
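The two-pointer merge is worth seeing on its own, outside of mergeSort. A minimal sketch that merges two already-sorted arrays (class and method names are illustrative):

```java
import java.util.Arrays;

public class MergeTwoSortedDemo {
    // Two-pointer merge of two sorted arrays: O(n + m) time.
    // The <= keeps ties from the first array first, i.e. stability.
    static int[] merge(int[] a, int[] b) {
        int[] out = new int[a.length + b.length];
        int i = 0, j = 0, k = 0;
        while (i < a.length && j < b.length)
            out[k++] = a[i] <= b[j] ? a[i++] : b[j++];
        while (i < a.length) out[k++] = a[i++]; // drain leftovers from a
        while (j < b.length) out[k++] = b[j++]; // drain leftovers from b
        return out;
    }

    public static void main(String[] args) {
        int[] merged = merge(new int[]{2, 4, 6, 7}, new int[]{1, 3, 5});
        System.out.println(Arrays.toString(merged)); // [1, 2, 3, 4, 5, 6, 7]
    }
}
```

This is exactly the inner loop of LC 88 and, pointer-for-pointer, the list version in LC 21.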

06 · Section Six · Quick Sort

Quick Sort: Fast Average, Dangerous Worst

Quick Sort picks a pivot element, partitions the array into elements less than and greater than the pivot, then recursively sorts each partition. Average case is O(n log n) with excellent cache performance; in practice it is often faster than Merge Sort despite the same asymptotic complexity. The danger: if the pivot is always the smallest or largest element (e.g., the first element on already-sorted input), partitions are maximally unbalanced and the runtime degrades to O(n²). Mitigation: random pivot selection or median-of-three. Quick Sort is in-place (O(log n) stack space) and is the basis of Java's Arrays.sort() for primitives (Dual-Pivot Quick Sort). It is not stable.

Pseudocode
function quickSort(arr, low, high):
    if low >= high: return
    pivotIdx = partition(arr, low, high)
    quickSort(arr, low, pivotIdx - 1)
    quickSort(arr, pivotIdx + 1, high)

function partition(arr, low, high):     // Lomuto partition
    pivot = arr[high]                   // pivot = last element
    i = low - 1                         // boundary of "≤ pivot" zone
    for j = low to high - 1:
        if arr[j] <= pivot:
            i = i + 1
            swap(arr[i], arr[j])        // expand "≤ pivot" zone
    swap(arr[i + 1], arr[high])         // place pivot in final position
    return i + 1                        // pivot index
Partition step: pivot = 4, elements rearranged around it
[Diagram: BEFORE (pivot = 4): [7, 2, 6, 1, 4]. AFTER: [2, 1, 4, 6, 7], with elements ≤ pivot on the left and elements > pivot on the right. The pivot is now in its FINAL position; recursively sorting the left and right partitions gives O(n log n) average.]
Common Mistake: Using the first element as pivot on already-sorted input. Every partition puts all elements on one side, so the recursion depth becomes n instead of log n and the runtime degrades to O(n²). Fix: pick a random pivot (swap(arr[low + rand(high-low)], arr[high])) or use median-of-three (the median of the first, middle, and last elements).
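The random-pivot fix is a two-line change to the Lomuto partition: pick a random index and swap it to the end before partitioning as usual. A minimal sketch (class name is illustrative), exercised on sorted input, the worst case for a fixed last-element pivot:

```java
import java.util.Arrays;
import java.util.Random;

public class RandomPivotDemo {
    static final Random RND = new Random();

    static void quickSort(int[] arr, int lo, int hi) {
        if (lo >= hi) return;
        int p = partition(arr, lo, hi);
        quickSort(arr, lo, p - 1);
        quickSort(arr, p + 1, hi);
    }

    // Lomuto partition with a random pivot: expected O(n log n)
    // even on already-sorted or reverse-sorted input.
    static int partition(int[] arr, int lo, int hi) {
        int r = lo + RND.nextInt(hi - lo + 1);
        swap(arr, r, hi);                      // move random pivot to the end
        int pivot = arr[hi], i = lo - 1;
        for (int j = lo; j < hi; j++)
            if (arr[j] <= pivot) swap(arr, ++i, j);
        swap(arr, i + 1, hi);
        return i + 1;
    }

    static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

    public static void main(String[] args) {
        int[] a = {1, 2, 3, 4, 5, 6, 7}; // sorted: O(n²) with a fixed pivot
        quickSort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 4, 5, 6, 7]
    }
}
```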

07 · Section Seven · Comparison

Sorting Algorithms: Head to Head

Algorithm       Best        Average     Worst       Space     Stable  Notes
Bubble Sort     O(n)        O(n²)       O(n²)       O(1)      ✓       Teaching only
Selection Sort  O(n²)       O(n²)       O(n²)       O(1)      ✗       Min swaps (n-1)
Insertion Sort  O(n)        O(n²)       O(n²)       O(1)      ✓       Best for small/nearly-sorted
Merge Sort      O(n log n)  O(n log n)  O(n log n)  O(n)      ✓       Guaranteed performance, stable
Quick Sort      O(n log n)  O(n log n)  O(n²)       O(log n)  ✗       Fastest in practice, in-place
Heap Sort       O(n log n)  O(n log n)  O(n log n)  O(1)      ✗       In-place + guaranteed, poor cache
Counting Sort   O(n+k)      O(n+k)      O(n+k)      O(k)      ✓       Integers only, k = value range
Radix Sort      O(d·(n+k))  O(d·(n+k))  O(d·(n+k))  O(n+k)    ✓       d = digits, k = base
Choose Merge Sort when

Guaranteed O(n log n) + Stability

  • Sorting linked lists (no random access needed)
  • Stability required (sort by multiple keys)
  • Worst-case guarantee matters
  • External sorting (disk-based)
Choose Quick Sort when

In-place + Fast Average

  • Array sorting with tight memory
  • Average-case speed matters most
  • In-place requirement (O(1) extra)
  • Cache performance matters (sequential access)

08 · Section Eight · Implementation

Build It Once

Build Merge Sort and Quick Sort from scratch once; it makes the mechanics concrete. In any real interview, call Arrays.sort() and focus on the actual problem.

Java: Merge Sort & Quick Sort core mechanics

// ── MERGE SORT ───────────────────────────── O(n log n), O(n) space
void mergeSort(int[] arr, int l, int r) {
    if (l >= r) return;
    int m = l + (r - l) / 2;
    mergeSort(arr, l, m);
    mergeSort(arr, m + 1, r);
    merge(arr, l, m, r);
}

void merge(int[] arr, int l, int m, int r) {
    int[] tmp = new int[r - l + 1];
    int i = l, j = m + 1, k = 0;
    while (i <= m && j <= r)
        tmp[k++] = arr[i] <= arr[j] ? arr[i++] : arr[j++];
    while (i <= m) tmp[k++] = arr[i++];
    while (j <= r) tmp[k++] = arr[j++];
    System.arraycopy(tmp, 0, arr, l, tmp.length);
}

// ── QUICK SORT ───────────────────────────── O(n log n) avg, O(log n) space
void quickSort(int[] arr, int lo, int hi) {
    if (lo >= hi) return;
    int p = partition(arr, lo, hi);
    quickSort(arr, lo, p - 1);
    quickSort(arr, p + 1, hi);
}

int partition(int[] arr, int lo, int hi) {
    int pivot = arr[hi], i = lo - 1;
    for (int j = lo; j < hi; j++)
        if (arr[j] <= pivot) swap(arr, ++i, j);
    swap(arr, i + 1, hi);
    return i + 1;
}

void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

09 · Section Nine · Java Reference

Use It In Java

IN JAVA: Arrays.sort() and Collections.sort()
Primitives: Arrays.sort(int[]) uses Dual-Pivot Quick Sort; NOT stable, O(n log n) avg
Objects: Arrays.sort(T[]) / Collections.sort(List) use TimSort; stable, O(n log n) guaranteed
Import: import java.util.Arrays; / import java.util.Collections;
Common sorting patterns in interviews
// Sort an int array ascending
Arrays.sort(arr);

// Sort an int array descending: no direct way, box first
Integer[] boxed = Arrays.stream(arr).boxed().toArray(Integer[]::new);
Arrays.sort(boxed, Collections.reverseOrder());

// Sort a 2D array by first element (e.g., intervals);
// use Integer.compare, not subtraction, to avoid overflow
Arrays.sort(intervals, (a, b) -> Integer.compare(a[0], b[0]));

// Sort by custom comparator (here: a String[] by length)
Arrays.sort(words, (a, b) -> Integer.compare(a.length(), b.length()));
Arrays.sort(words, Comparator.comparingInt(String::length));

// Sort a range only
Arrays.sort(arr, fromIndex, toIndex);

// Partial sort: kth smallest using a max-heap of size k
PriorityQueue<Integer> pq = new PriorityQueue<>(Collections.reverseOrder());
for (int x : arr) {
    pq.offer(x);
    if (pq.size() > k) pq.poll();   // O(n log k)
}
int kthSmallest = pq.peek();
Method                        Algorithm              Stable  Note
Arrays.sort(int[])            Dual-Pivot Quick Sort  ✗       Primitives: fastest for int[], long[]
Arrays.sort(Object[])         TimSort                ✓       Objects: merge sort + insertion sort hybrid
Collections.sort(List)        TimSort                ✓       Delegates to List.sort()
Arrays.sort(T[], Comparator)  TimSort                ✓       Custom ordering, lambda-friendly
⚠ Gotcha: Arrays.sort(int[]) is NOT stable (Quick Sort). Arrays.sort(Integer[]) IS stable (TimSort). If you need stable sorting of primitives, box them to Integer[] first; this matters when sorting intervals where ties in one key must preserve order from a previous sort.
⚠ Gotcha: Never use (a, b) -> a - b as a comparator for large integers; it overflows when a is large positive and b is large negative (or vice versa). Use Integer.compare(a, b) instead. For 2D arrays: (a, b) -> Integer.compare(a[0], b[0]).
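The overflow is easy to demonstrate without a full sort (class name is illustrative):

```java
public class ComparatorOverflowDemo {
    public static void main(String[] args) {
        // What (a, b) -> a - b computes for a = Integer.MIN_VALUE, b = 1:
        int bad = Integer.MIN_VALUE - 1;   // wraps around to Integer.MAX_VALUE
        System.out.println(bad > 0);       // true: MIN_VALUE judged "greater" than 1

        // Integer.compare never overflows:
        int good = Integer.compare(Integer.MIN_VALUE, 1);
        System.out.println(good < 0);      // true: correct ordering
    }
}
```

A comparator that returns the wrong sign for some pairs doesn't just misplace elements; TimSort may even throw "Comparison method violates its general contract!" when it detects the inconsistency.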

10 · Section Ten · Practice

Problems To Solve

Most interview problems don't ask you to implement a sorting algorithm; they ask you to sort first, then solve the real problem using the sorted order. "Merge Intervals" sorts by start time then merges overlapping intervals. "Meeting Rooms" sorts by start time to find conflicts. "Kth Largest Element" uses Quick Select or a heap. The sorting step is O(n log n) preprocessing that enables an O(n) solution. Recognise the signal: when order doesn't matter in the input but would simplify the logic, sort first.

Difficulty · Pattern · Problem · Key Insight
Medium · sort + merge · Merge Intervals (LC 56)
  Sort intervals by start time. Scan left to right, merging overlapping intervals: if current.start ≤ previous.end, extend; otherwise start a new interval. O(n log n) sort + O(n) scan.
Medium · sort + two-pointer · 3Sum (LC 15)
  Sort the array. For each element, use two pointers on the remaining portion to find pairs summing to the target complement. Sorting enables skipping duplicates and efficient pair-finding.
Medium · quick select · Kth Largest Element (LC 215)
  Quick Select finds the kth element in O(n) average without fully sorting. Alternatively, use a min-heap of size k for O(n log k). Both are preferred over a full O(n log n) sort.
Medium · sort + greedy · Non-overlapping Intervals (LC 435)
  Sort by end time. Greedily pick intervals that end earliest, skipping those that overlap with the last selected. Classic activity selection problem.
Medium · sort · Sort Colors (LC 75)
  Dutch National Flag problem: three-way partition in a single O(n) pass using three pointers. A counting sort also works in two passes.
Hard · merge sort · Count of Smaller Numbers After Self (LC 315)
  Modified merge sort: during the merge step, count how many elements from the right half are merged before each element from the left half; these are the smaller numbers after self.
Interview Pattern:
  • When the input is unsorted and the problem involves comparisons, ranges, or proximity, sort first.
  • The O(n log n) sorting step is almost never the bottleneck, and the resulting ordered data simplifies the remaining logic from O(n²) to O(n).
  • In Java, Arrays.sort() is one line; use it freely and focus on the actual problem logic.
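The "sort first, then one linear pass" pattern is worth seeing end to end. A minimal sketch of Merge Intervals (LC 56); the class name is illustrative:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class MergeIntervalsDemo {
    // LC 56 pattern: O(n log n) sort by start, then one O(n) merging scan.
    static int[][] merge(int[][] intervals) {
        Arrays.sort(intervals, (a, b) -> Integer.compare(a[0], b[0]));
        List<int[]> out = new ArrayList<>();
        for (int[] cur : intervals) {
            int[] last = out.isEmpty() ? null : out.get(out.size() - 1);
            if (last != null && cur[0] <= last[1]) {
                last[1] = Math.max(last[1], cur[1]); // overlap: extend previous
            } else {
                out.add(new int[]{cur[0], cur[1]});  // no overlap: start new
            }
        }
        return out.toArray(new int[0][]);
    }

    public static void main(String[] args) {
        int[][] merged = merge(new int[][]{{1, 3}, {8, 10}, {2, 6}, {15, 18}});
        System.out.println(Arrays.deepToString(merged)); // [[1, 6], [8, 10], [15, 18]]
    }
}
```

Without the sort, every interval would have to be checked against every other: O(n²). Sorting makes a single left-to-right scan sufficient.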
