Sorting rearranges elements into order — the foundation for binary search, merge intervals, and half of all interview problems. Master the O(n log n) algorithms and know when O(n²) is acceptable.
Algorithms
Sorting Fundamentals
Rearranging elements into order — the single most enabling operation in computer science. Sorting unlocks binary search, simplifies duplicate detection, powers merge intervals, and is the hidden first step in half of all interview problems.
Sorting transforms random data into ordered data — and ordered data is fundamentally easier to work with. Once an array is sorted, binary search finds any element in O(log n) instead of O(n). Duplicate detection becomes a single linear scan of adjacent elements. The "merge intervals" pattern (LC 56) only works if intervals are sorted by start time. The "two-sum" pattern has an O(n) two-pointer variant that requires sorted input. In interviews, sorting is rarely the entire problem — it's the enabling first step that simplifies everything after it.

The theoretical lower bound for comparison-based sorting is Ω(n log n) — no comparison sort can beat this in the worst case. Merge Sort and Heap Sort achieve this bound with guaranteed O(n log n). Quick Sort achieves O(n log n) average but O(n²) worst case with poor pivot selection. Non-comparison sorts (Counting Sort, Radix Sort, Bucket Sort) beat this bound by exploiting integer properties, achieving O(n+k) where k is the value range — but they only apply to integers within a bounded range.

Java's Arrays.sort() uses Dual-Pivot Quick Sort for primitives and TimSort (a merge sort variant) for objects — in interviews, call Arrays.sort() and focus on the actual problem, not reimplementing sort.
Analogy: A library with books thrown randomly on shelves. Finding any specific book requires scanning every shelf — O(n). Now sort all books alphabetically. Finding any book becomes a quick binary narrowing — check the middle shelf, go left or right — O(log n). The one-time cost of sorting (reshelving all books) pays for itself the moment you need to search more than once. That's the trade-off: O(n log n) upfront to enable O(log n) per query forever after.
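This trade-off can be sketched in a few lines of Java (the class name and data are illustrative):

```java
import java.util.Arrays;

public class SortThenSearch {
    public static void main(String[] args) {
        int[] shelf = {42, 7, 19, 3, 88, 25, 61};
        Arrays.sort(shelf);                       // one-time O(n log n) "reshelving"
        // sorted: [3, 7, 19, 25, 42, 61, 88]; every lookup is now O(log n)
        int idx = Arrays.binarySearch(shelf, 25); // binary narrowing
        System.out.println(idx);                  // prints 3
    }
}
```

After the single sort, any number of `binarySearch` calls run in O(log n) each.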
02
Section Two · Mental Model
How Sorting Thinks
Comparison sorts compare pairs of elements to determine order
▶
Ω(n log n) lower bound — n! possible orderings require log(n!) = Ω(n log n) comparisons to distinguish, provable by a decision tree argument
Divide-and-conquer splits the problem in half each level
▶
log n levels × O(n) work per level = O(n log n) — the structure behind both Merge Sort and Quick Sort
Stability means equal elements preserve their original relative order
▶
Merge Sort is stable, Quick Sort is not — matters when sorting objects by one key while preserving order from a previous sort by another key
In-place sorting uses O(1) extra space beyond the input
▶
Quick Sort is in-place (O(log n) stack), Merge Sort on arrays requires O(n) auxiliary space — a key trade-off between time guarantee and memory
Already-sorted or nearly-sorted input is a special case
▶
Insertion Sort is O(n) on nearly-sorted data — this is why TimSort (Java's Arrays.sort for objects) combines merge sort with insertion sort for small/sorted runs
Non-comparison sorts exploit integer value range instead of comparisons
▶
Counting Sort: O(n+k) time, O(k) space — fast when k (max value) is small; impractical when k is huge (e.g., sorting 64-bit integers)
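Stability is easiest to see with a two-key example. A minimal sketch (the data and class name are made up for illustration) showing that TimSort keeps ties in their prior order:

```java
import java.util.Arrays;
import java.util.Comparator;

public class StabilityDemo {
    public static void main(String[] args) {
        // (name, score) pairs, already in name order
        String[][] rows = {{"alice","2"},{"bob","1"},{"carol","2"},{"dave","1"}};
        // Arrays.sort on objects uses TimSort, which is stable:
        // rows with equal scores keep their original (name) order.
        Arrays.sort(rows, Comparator.comparingInt(r -> Integer.parseInt(r[1])));
        // score-1 group: bob before dave; score-2 group: alice before carol
        System.out.println(Arrays.deepToString(rows));
    }
}
```

With an unstable sort, equal-score rows could come out in either order, breaking a multi-key "sort by B, then stably by A" workflow.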
03
Section Three · O(n²) Sorts
Bubble Sort & Selection Sort
Bubble Sort and Selection Sort are O(n²) algorithms that nobody uses in production — but they appear in interview conceptual questions and are useful for understanding sorting invariants. Know them conceptually; never implement them when asked to "sort efficiently."
Bubble Sort
Swap adjacent pairs, bubble largest to the end
for i = 0 to n-1:
for j = 0 to n-i-2:
if arr[j] > arr[j+1]:
swap(arr[j], arr[j+1])
// O(n²) time, O(1) space, stable
After pass i, the i largest elements are in their final positions at the end. Optimization: if no swaps occur in a pass, the array is sorted — exit early (O(n) best case).
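The early-exit optimization looks like this as a Java sketch (class name illustrative; for conceptual understanding, not production):

```java
public class BubbleSort {
    // Bubble Sort with early exit: O(n²) worst case, O(n) on sorted input.
    static void bubbleSort(int[] arr) {
        for (int i = 0; i < arr.length - 1; i++) {
            boolean swapped = false;
            for (int j = 0; j < arr.length - i - 1; j++) {
                if (arr[j] > arr[j + 1]) {          // adjacent pair out of order
                    int t = arr[j]; arr[j] = arr[j + 1]; arr[j + 1] = t;
                    swapped = true;
                }
            }
            if (!swapped) return;  // no swaps in this pass → already sorted
        }
    }
}
```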
Selection Sort
Find minimum, place at front
for i = 0 to n-1:
minIdx = i
for j = i+1 to n-1:
if arr[j] < arr[minIdx]:
minIdx = j
swap(arr[i], arr[minIdx])
// O(n²) time, O(1) space, NOT stable
After iteration i, positions 0..i hold the i+1 smallest elements in sorted order. Always does exactly n(n-1)/2 comparisons — no best-case optimization.
When O(n²) is acceptable:
For n ≤ 20–30 elements, Insertion Sort (next section) outperforms Merge/Quick Sort due to lower constant overhead — no recursion, no auxiliary array, cache-friendly sequential access.
This is why TimSort and Introsort switch to insertion sort for small partitions.
04
Section Four · Insertion Sort
Insertion Sort — The Nearly-Sorted Champion
Insertion Sort builds the sorted portion one element at a time — pick the next unsorted element and insert it into the correct position by shifting larger elements right. It's O(n²) worst case but O(n) on already-sorted or nearly-sorted data, making it the optimal algorithm for small arrays and the finishing step inside hybrid sorts like TimSort. Every hand of cards you've ever sorted was insertion sort — pick up a card, slide it into position among the cards you already hold.
Pseudocode
function insertionSort(arr):
    for i = 1 to n - 1:
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]   // shift right
            j = j - 1
        arr[j + 1] = key          // insert at correct position
// O(n²) worst, O(n) best (sorted)
Insertion Sort โ inserting 3 into the sorted portion [1, 4, 7]
Insertion Sort is O(n) on nearly-sorted data:
If each element is at most k positions from its final location, insertion sort runs in O(nk). For k = O(1), it's O(n).
This is why it's used as the base case in TimSort and Introsort — small subarrays are nearly sorted after partitioning.
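One way to see the O(nk) bound concretely is to count shifts: insertion sort's total work equals each element's displacement from its final position. A small Java sketch (names are illustrative):

```java
public class NearlySorted {
    // Insertion sort instrumented to count element shifts.
    // Total shifts = sum of each element's displacement, so if every
    // element is at most k slots from home, work is O(nk).
    static long insertionSortShifts(int[] arr) {
        long shifts = 0;
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i], j = i - 1;
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j];  // shift right
                j--;
                shifts++;
            }
            arr[j + 1] = key;
        }
        return shifts;
    }

    public static void main(String[] args) {
        int[] nearly = {1, 2, 4, 3, 5, 6, 8, 7};  // each element ≤ 1 slot away
        System.out.println(insertionSortShifts(nearly));  // prints 2 — O(n) total work
    }
}
```

On a reversed array of the same length the shift count would be n(n-1)/2 = 28 — the O(n²) worst case.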
05
Section Five · Merge Sort
Merge Sort — Guaranteed O(n log n)
Merge Sort is the divide-and-conquer sorting algorithm: split the array in half, recursively sort each half, then merge the two sorted halves in O(n). It gives a guaranteed O(n log n) worst case — no bad input can make it degrade. The merge step is the key insight: merging two sorted arrays into one sorted array costs O(n) using two pointers. The trade-off is O(n) auxiliary space for the merge buffer. Merge Sort is stable (equal elements maintain relative order) and is the foundation of Java's Arrays.sort() for objects (TimSort), external sorting (sorting data too large for RAM), and the "count inversions" interview pattern.
Pseudocode
function mergeSort(arr, left, right):
    if left >= right: return          // base case: 0–1 elements
    mid = (left + right) / 2
    mergeSort(arr, left, mid)         // sort left half
    mergeSort(arr, mid + 1, right)    // sort right half
    merge(arr, left, mid, right)      // merge two sorted halves

function merge(arr, left, mid, right):
    temp = new array[right - left + 1]
    i = left, j = mid + 1, k = 0
    while i <= mid and j <= right:
        if arr[i] <= arr[j]:          // <= ensures stability
            temp[k++] = arr[i++]
        else:
            temp[k++] = arr[j++]
    copy remaining from left or right
    copy temp back to arr[left..right]  // O(n) merge
// Total: O(n log n), space O(n)
Merge Sort — split tree and merge-back
Merge Sort's merge step is the basis of "merge two sorted arrays/lists":
The merge operation appears independently as an interview problem (LC 88 "Merge Sorted Array," LC 21 "Merge Two Sorted Lists") and inside other algorithms (external sort, count inversions).
Master the two-pointer merge — it appears everywhere.
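The standalone two-pointer merge might look like this in Java (a sketch; class and method names are illustrative):

```java
public class MergeSortedArrays {
    // Two-pointer merge of two sorted arrays — the core of Merge Sort,
    // and the pattern behind LC 21/88. O(m + n) time, O(m + n) space.
    static int[] merge(int[] a, int[] b) {
        int[] out = new int[a.length + b.length];
        int i = 0, j = 0, k = 0;
        while (i < a.length && j < b.length)
            out[k++] = a[i] <= b[j] ? a[i++] : b[j++];  // <= keeps stability
        while (i < a.length) out[k++] = a[i++];         // drain leftovers
        while (j < b.length) out[k++] = b[j++];
        return out;
    }
}
```

Usage: `merge(new int[]{1,3,5}, new int[]{2,4,6})` yields `[1,2,3,4,5,6]`.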
06
Section Six · Quick Sort
Quick Sort — Fast Average, Dangerous Worst
Quick Sort picks a pivot element, partitions the array into elements less than and greater than the pivot, then recursively sorts each partition. Average case O(n log n) with excellent cache performance — in practice, often faster than Merge Sort despite the same asymptotic complexity. The danger: if the pivot is always the smallest or largest element (e.g., first element on already-sorted input), partitions are maximally unbalanced — O(n²). Mitigation: random pivot selection or median-of-three. Quick Sort is in-place (O(log n) stack space) and is the basis of Java's Arrays.sort() for primitives (Dual-Pivot Quick Sort). Not stable.
Pseudocode
function quickSort(arr, low, high):
    if low >= high: return
    pivotIdx = partition(arr, low, high)
    quickSort(arr, low, pivotIdx - 1)
    quickSort(arr, pivotIdx + 1, high)

function partition(arr, low, high):    // Lomuto partition
    pivot = arr[high]                  // pivot = last element
    i = low - 1                        // boundary of "≤ pivot" zone
    for j = low to high - 1:
        if arr[j] <= pivot:
            i = i + 1
            swap(arr[i], arr[j])       // expand "≤ pivot" zone
    swap(arr[i + 1], arr[high])        // place pivot in final position
    return i + 1                       // pivot index
Partition step — pivot = 4, elements rearranged around it
Common Mistake: Using the first element as pivot on already-sorted input. Every partition puts all elements on one side — depth becomes n instead of log n — O(n²). Fix: pick a random pivot (swap(arr[low + rand(high-low)], arr[high])) or use median-of-three (the median of the first, middle, and last elements).
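A median-of-three sketch in Java, written to feed a Lomuto-style partition by moving the chosen pivot to arr[high] (class and method names are illustrative):

```java
public class MedianOfThree {
    // Sort the three samples arr[low], arr[mid], arr[high] among themselves,
    // so the median ends up at mid, then move it to arr[high] for partition.
    static void medianOfThreeToEnd(int[] arr, int low, int high) {
        int mid = low + (high - low) / 2;
        if (arr[mid]  < arr[low]) swap(arr, mid,  low);
        if (arr[high] < arr[low]) swap(arr, high, low);
        if (arr[high] < arr[mid]) swap(arr, high, mid);
        // invariant now: arr[low] <= arr[mid] <= arr[high]
        swap(arr, mid, high);  // median becomes the pivot slot arr[high]
    }

    static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }
}
```

On already-sorted input the middle element is the true median, so partitions stay balanced and the O(n²) trap disappears.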
07
Section Seven · Comparison
Sorting Algorithms — Head to Head
| Algorithm | Best | Average | Worst | Space | Stable | Notes |
|---|---|---|---|---|---|---|
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | ✓ | Teaching only |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | ✗ | Min swaps (n-1) |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | ✓ | Best for small/nearly-sorted |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | ✓ | Guaranteed performance, stable |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | ✗ | Fastest in practice, in-place |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | ✗ | In-place + guaranteed, poor cache |
| Counting Sort | O(n+k) | O(n+k) | O(n+k) | O(k) | ✓ | Integers only, k = value range |
| Radix Sort | O(d·(n+k)) | O(d·(n+k)) | O(d·(n+k)) | O(n+k) | ✓ | d = digits, k = base |
Choose Merge Sort when
Guaranteed O(n log n) + Stability
Sorting linked lists (no random access needed)
Stability required (sort by multiple keys)
Worst-case guarantee matters
External sorting (disk-based)
Choose Quick Sort when
In-place + Fast Average
Array sorting with tight memory
Average-case speed matters most
In-place requirement (O(1) extra)
Cache performance matters (sequential access)
08
Section Eight · Implementation
Build It Once
Build Merge Sort and Quick Sort from scratch once — it makes the mechanics concrete. In any real interview, call Arrays.sort() and focus on the actual problem.
Java — Merge Sort & Quick Sort core mechanics
// ── MERGE SORT ─────────────────────────── O(n log n), O(n) space
void mergeSort(int[] arr, int l, int r) {
    if (l >= r) return;
    int m = l + (r - l) / 2;
    mergeSort(arr, l, m);
    mergeSort(arr, m + 1, r);
    merge(arr, l, m, r);
}

void merge(int[] arr, int l, int m, int r) {
    int[] tmp = new int[r - l + 1];
    int i = l, j = m + 1, k = 0;
    while (i <= m && j <= r)
        tmp[k++] = arr[i] <= arr[j] ? arr[i++] : arr[j++];
    while (i <= m) tmp[k++] = arr[i++];
    while (j <= r) tmp[k++] = arr[j++];
    System.arraycopy(tmp, 0, arr, l, tmp.length);
}

// ── QUICK SORT ─────────────────────────── O(n log n) avg, O(log n) space
void quickSort(int[] arr, int lo, int hi) {
    if (lo >= hi) return;
    int p = partition(arr, lo, hi);
    quickSort(arr, lo, p - 1);
    quickSort(arr, p + 1, hi);
}

int partition(int[] arr, int lo, int hi) {
    int pivot = arr[hi], i = lo - 1;
    for (int j = lo; j < hi; j++)
        if (arr[j] <= pivot)
            swap(arr, ++i, j);
    swap(arr, i + 1, hi);
    return i + 1;
}

void swap(int[] a, int i, int j) {
    int t = a[i]; a[i] = a[j]; a[j] = t;
}
Sorting — Full Implementation
Java — All sorting implementations
import java.util.*;

public class Sorting {

    // ─── 1. Insertion Sort ─────────────────── O(n²), O(1), stable
    static void insertionSort(int[] arr) {
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i], j = i - 1;
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j];
                j--;
            }
            arr[j + 1] = key;
        }
    }

    // ─── 2. Merge Sort ─────────────────────── O(n log n), O(n), stable
    static void mergeSort(int[] arr, int l, int r) {
        if (l >= r) return;
        int m = l + (r - l) / 2;
        mergeSort(arr, l, m);
        mergeSort(arr, m + 1, r);
        merge(arr, l, m, r);
    }

    static void merge(int[] arr, int l, int m, int r) {
        int[] tmp = new int[r - l + 1];
        int i = l, j = m + 1, k = 0;
        while (i <= m && j <= r)
            tmp[k++] = arr[i] <= arr[j] ? arr[i++] : arr[j++];
        while (i <= m) tmp[k++] = arr[i++];
        while (j <= r) tmp[k++] = arr[j++];
        System.arraycopy(tmp, 0, arr, l, tmp.length);
    }

    // ─── 3. Quick Sort (random pivot) ──────── O(n log n) avg, O(log n)
    static Random rand = new Random();

    static void quickSort(int[] arr, int lo, int hi) {
        if (lo >= hi) return;
        // Random pivot to avoid O(n²) on sorted input
        swap(arr, lo + rand.nextInt(hi - lo + 1), hi);
        int p = partition(arr, lo, hi);
        quickSort(arr, lo, p - 1);
        quickSort(arr, p + 1, hi);
    }

    static int partition(int[] arr, int lo, int hi) {
        int pivot = arr[hi], i = lo - 1;
        for (int j = lo; j < hi; j++)
            if (arr[j] <= pivot) swap(arr, ++i, j);
        swap(arr, i + 1, hi);
        return i + 1;
    }

    // ─── 4. Quick Select (kth smallest) ────── O(n) avg
    static int quickSelect(int[] arr, int lo, int hi, int k) {
        if (lo == hi) return arr[lo];
        swap(arr, lo + rand.nextInt(hi - lo + 1), hi);
        int p = partition(arr, lo, hi);
        if (k == p) return arr[p];
        return k < p ? quickSelect(arr, lo, p - 1, k)
                     : quickSelect(arr, p + 1, hi, k);
    }

    // ─── 5. Counting Sort ──────────────────── O(n+k), O(k)
    static int[] countingSort(int[] arr, int maxVal) {
        int[] count = new int[maxVal + 1];
        for (int x : arr) count[x]++;
        int idx = 0;
        for (int v = 0; v <= maxVal; v++)
            while (count[v]-- > 0) arr[idx++] = v;
        return arr;
    }

    static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }

    // ─── Main ────────────────────────────────
    public static void main(String[] args) {
        int[] a1 = {7,4,6,2,3,1,5};
        mergeSort(a1, 0, a1.length - 1);
        System.out.println("Merge: " + Arrays.toString(a1));
        int[] a2 = {7,4,6,2,3,1,5};
        quickSort(a2, 0, a2.length - 1);
        System.out.println("Quick: " + Arrays.toString(a2));
        int[] a3 = {7,4,6,2,3,1,5};
        insertionSort(a3);
        System.out.println("Insert: " + Arrays.toString(a3));
        int[] a4 = {7,4,6,2,3,1,5};
        System.out.println("3rd smallest: " + quickSelect(a4, 0, a4.length - 1, 2));
    }
}
Python — Sorting implementations
import random

# ─── 1. Insertion Sort ─────────────────── O(n²), stable
def insertion_sort(arr):
    for i in range(1, len(arr)):
        key, j = arr[i], i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key

# ─── 2. Merge Sort ─────────────────────── O(n log n), stable
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(a, b):
    result, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            result.append(a[i]); i += 1
        else:
            result.append(b[j]); j += 1
    result.extend(a[i:])
    result.extend(b[j:])
    return result

# ─── 3. Quick Sort (random pivot) ──────── O(n log n) avg
def quick_sort(arr, lo, hi):
    if lo >= hi: return
    pi = random.randint(lo, hi)
    arr[pi], arr[hi] = arr[hi], arr[pi]
    p = partition(arr, lo, hi)
    quick_sort(arr, lo, p - 1)
    quick_sort(arr, p + 1, hi)

def partition(arr, lo, hi):
    pivot, i = arr[hi], lo - 1
    for j in range(lo, hi):
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[hi] = arr[hi], arr[i + 1]
    return i + 1

# ─── 4. Counting Sort ──────────────────── O(n+k)
def counting_sort(arr, max_val):
    count = [0] * (max_val + 1)
    for x in arr: count[x] += 1
    idx = 0
    for v in range(max_val + 1):
        while count[v] > 0:
            arr[idx] = v; idx += 1; count[v] -= 1
    return arr

# Demo
a = [7,4,6,2,3,1,5]
print("Merge:", merge_sort(a[:]))
b = a[:]
quick_sort(b, 0, len(b)-1)
print("Quick:", b)
c = a[:]
insertion_sort(c)
print("Insert:", c)
// Sort int array ascending
Arrays.sort(arr);

// Sort int array descending — no direct way, box first
Integer[] boxed = Arrays.stream(arr).boxed().toArray(Integer[]::new);
Arrays.sort(boxed, Collections.reverseOrder());

// Sort 2D array by first element (e.g., intervals) — use Integer.compare, not subtraction
Arrays.sort(intervals, (a, b) -> Integer.compare(a[0], b[0]));

// Sort by custom comparator
Arrays.sort(arr, (a, b) -> a.length() - b.length()); // by string length
Arrays.sort(arr, Comparator.comparingInt(String::length));

// Sort a range only
Arrays.sort(arr, fromIndex, toIndex);

// Partial sort — kth smallest using PriorityQueue
PriorityQueue<Integer> pq = new PriorityQueue<>(Collections.reverseOrder());
for (int x : arr) {
    pq.offer(x);
    if (pq.size() > k) pq.poll(); // O(n log k)
}
int kthSmallest = pq.peek();
| Method | Algorithm | Stable | Note |
|---|---|---|---|
| Arrays.sort(int[]) | Dual-Pivot Quick Sort | ✗ | Primitives — fastest for int[], long[] |
| Arrays.sort(Object[]) | TimSort | ✓ | Objects — merge sort + insertion sort hybrid |
| Collections.sort(List) | TimSort | ✓ | Delegates to List.sort() |
| Arrays.sort(T[], Comparator) | TimSort | ✓ | Custom ordering — lambda-friendly |
⚠ Gotcha: Arrays.sort(int[]) is NOT stable (Quick Sort). Arrays.sort(Integer[]) IS stable (TimSort). If you need stable sorting of primitives, box them to Integer[] first — this matters when sorting intervals where ties in one key must preserve order from a previous sort.
⚠ Gotcha: Never use (a, b) -> a - b as a comparator for large integers — it overflows when a is large positive and b is large negative (or vice versa). Use Integer.compare(a, b) instead. For 2D arrays: (a, b) -> Integer.compare(a[0], b[0]).
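The overflow is easy to demonstrate (the values below are chosen to trigger the wrap-around):

```java
public class ComparatorOverflow {
    public static void main(String[] args) {
        int a = 2_000_000_000, b = -2_000_000_000;
        // a - b is mathematically 4_000_000_000, which exceeds
        // Integer.MAX_VALUE and wraps to a negative int — so a
        // subtraction-based comparator wrongly reports a < b.
        System.out.println(a - b < 0);              // prints true — wrong sign!
        System.out.println(Integer.compare(a, b));  // prints 1 — correct: a > b
    }
}
```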
10
Section Ten · Practice
Problems To Solve
Most interview problems don't ask you to implement a sorting algorithm — they ask you to sort first, then solve the real problem using the sorted order. "Merge Intervals" sorts by start time then merges overlapping intervals. "Meeting Rooms" sorts by start time to find conflicts. "Kth Largest Element" uses Quick Select or a heap. The sorting step is O(n log n) preprocessing that enables an O(n) solution. Recognise the signal: when order doesn't matter in the input but would simplify the logic, sort first.
Sort intervals by start time. Scan left to right, merging overlapping intervals: if current.start ≤ previous.end, extend; otherwise start a new interval. O(n log n) sort + O(n) scan.
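That description might be sketched in Java roughly as follows (an illustrative sketch of the sort-then-scan approach, not a reference solution):

```java
import java.util.*;

public class MergeIntervals {
    // Sort by start, then one linear pass merging overlaps.
    static int[][] merge(int[][] intervals) {
        Arrays.sort(intervals, (a, b) -> Integer.compare(a[0], b[0]));
        List<int[]> out = new ArrayList<>();
        for (int[] cur : intervals) {
            int[] last = out.isEmpty() ? null : out.get(out.size() - 1);
            if (last != null && cur[0] <= last[1])
                last[1] = Math.max(last[1], cur[1]);  // overlap: extend previous end
            else
                out.add(cur);                         // gap: start a new interval
        }
        return out.toArray(new int[0][]);
    }
}
```

For `[[1,3],[2,6],[8,10],[15,18]]` this yields `[[1,6],[8,10],[15,18]]`.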
Sort the array. For each element, use two pointers on the remaining portion to find pairs summing to the target complement. Sorting enables skipping duplicates and efficient pair-finding.
Quick Select finds the kth element in O(n) average without fully sorting. Alternatively, use a min-heap of size k for O(n log k). Both are preferred over full sorting O(n log n).
Sort by end time. Greedily pick intervals that end earliest, skipping those that overlap with the last selected. Classic activity selection problem.
Medium
sort
Sort Colors — LC 75
Dutch National Flag problem: three-way partition in a single pass O(n) using three pointers. A counting sort also works in two passes.
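A sketch of the single-pass three-pointer partition (illustrative):

```java
public class SortColors {
    // Dutch National Flag: partition 0s / 1s / 2s in one pass.
    // lo = right edge of the 0s zone, hi = left edge of the 2s zone, mid scans.
    static void sortColors(int[] nums) {
        int lo = 0, mid = 0, hi = nums.length - 1;
        while (mid <= hi) {
            if (nums[mid] == 0)      swap(nums, lo++, mid++); // 0 → front zone
            else if (nums[mid] == 2) swap(nums, mid, hi--);   // 2 → back zone; recheck swapped-in value
            else                     mid++;                   // 1 stays in the middle
        }
    }

    static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }
}
```

Note that `mid` does not advance after swapping with `hi`: the element swapped in from the back has not been examined yet.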
Hard
merge sort
Count of Smaller Numbers After Self — LC 315
Modified merge sort: during the merge step, count how many elements from the right half are merged before each element from the left half — these are the smaller numbers after self.
Interview Pattern:
When the input is unsorted and the problem involves comparisons, ranges, or proximity — sort first.
The O(n log n) sorting step is almost never the bottleneck, and the resulting ordered data simplifies the remaining logic from O(n²) to O(n).
In Java, Arrays.sort() is one line — use it freely and focus on the actual problem logic.