Algorithms

Sorting Algorithms

Sorting is the process of rearranging elements in a defined order. It is the bedrock of countless algorithms that rely on sorted input (binary search, two-pointer, merge intervals). Knowing which sort to apply, and why, is one of the most-tested skills in technical interviews.

01
Section One · Foundation

What is Sorting?


A sorting algorithm rearranges the elements of a collection into a total order, typically ascending or descending. Sorting is foundational: binary search, merge-interval problems, frequency counting, and k-th element queries all assume sorted input. Algorithms fall into two broad families: comparison sorts (Merge, Quick, Heap; lower-bounded by Ω(n log n)) and non-comparison sorts (Counting, Radix, Bucket; can achieve O(n) under constraints). Choosing the right sort depends on input size, pre-sortedness, memory budget, and stability requirements.

Analogy: Think of sorting a hand of playing cards. Insertion sort copies how most people pick up and place cards one-by-one. Merge sort is like splitting the deck in half repeatedly until each pile is a single card, then merging pairs in order. Quick sort is like picking a "pivot" card, putting everything smaller to the left and larger to the right, then recursing on each pile.
Key Characteristics
Stable vs Unstable
Stable sorts preserve the relative order of equal elements (Merge, Insertion, Counting). Quick and Heap are unstable.
In-Place vs Out-of-Place
In-place sorts use O(1) extra space (Quick, Heap). Merge sort needs O(n) auxiliary space.
Comparison Lower Bound
Any comparison-based sort requires Ω(n log n) comparisons in the worst case, proved via a decision-tree argument.
Adaptive
Adaptive sorts exploit existing order (Timsort, insertion sort on nearly-sorted input) to run faster than their worst-case bound.
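To make stability concrete, here is a minimal sketch (with made-up key/tag values) using Java's Arrays.sort on objects, which is Timsort and therefore stable: elements with equal keys keep their original left-to-right order.

```java
import java.util.Arrays;
import java.util.Comparator;

public class StabilityDemo {
    public static void main(String[] args) {
        // (key, tag) pairs; the tag records each element's original position.
        String[][] items = { {"2", "a"}, {"1", "b"}, {"2", "c"}, {"1", "d"} };
        // Arrays.sort on an Object[] is Timsort: stable by contract,
        // so ties on the key keep their input order.
        Arrays.sort(items, Comparator.comparing((String[] p) -> p[0]));
        for (String[] p : items) System.out.println(p[0] + p[1]);
        // prints: 1b, 1d, 2a, 2c (b stays before d, a stays before c)
    }
}
```

An unstable sort (quick sort, heap sort) would be free to emit 1d before 1b, which matters whenever a secondary ordering was established by an earlier pass.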
Algorithm Landscape
| Algorithm      | Family         | Stable | In-Place | Best       | Average    | Worst      |
| Merge Sort     | Comparison     | ✓      | ✗        | O(n log n) | O(n log n) | O(n log n) |
| Quick Sort     | Comparison     | ✗      | ✓        | O(n log n) | O(n log n) | O(n²)      |
| Heap Sort      | Comparison     | ✗      | ✓        | O(n log n) | O(n log n) | O(n log n) |
| Insertion Sort | Comparison     | ✓      | ✓        | O(n)       | O(n²)      | O(n²)      |
| Radix Sort     | Non-Comparison | ✓      | ✗        | O(nk)      | O(nk)      | O(nk)      |
| Counting Sort  | Non-Comparison | ✓      | ✗        | O(n+k)     | O(n+k)     | O(n+k)     |
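Insertion sort appears in the table as the only O(n)-best-case entry; its adaptivity is easy to see in code. A minimal Java sketch: on already-sorted input the inner while loop never fires, so the whole pass is linear.

```java
public class InsertionSort {
    // Stable, in-place, adaptive: nearly-sorted input approaches O(n).
    static void insertionSort(int[] arr) {
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i];
            int j = i - 1;
            // Shift larger elements one slot right until key's position is found.
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j];
                j--;
            }
            arr[j + 1] = key;
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 4, 6, 1, 3};
        insertionSort(a);
        System.out.println(java.util.Arrays.toString(a)); // [1, 2, 3, 4, 5, 6]
    }
}
```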
02
Section Two · Core Algorithms

Algorithm Walkthroughs

Merge Sort – Divide & Conquer
Recursively split the array in half, sort each half, then merge the two sorted halves. Guaranteed O(n log n) on all inputs.
| Step | Action                                                                    | State After              |
| 1    | Split array at mid-point into left and right halves                       | [5,3,8,1] → [5,3], [8,1] |
| 2    | Recursively sort each half until subarrays of size 1                      | [3,5] and [1,8]          |
| 3    | Merge: compare front elements of both halves, pick smaller, advance pointer | [1,3,5,8]              |
Tip: Merge sort is the go-to for sorting linked lists (no random access needed) and for external sorting (files too big for RAM). Its stable sort property also makes it useful when secondary ordering matters.
Quick Sort – Partition & Recurse
Choose a pivot, partition elements around it so all elements left < pivot < all elements right, then recurse on each partition.
| Step | Action                                                                  | State After                       |
| 1    | Choose pivot (e.g. last element); set i = lo - 1                        | pivot=4, arr=[3,6,8,10,1,2,4]     |
| 2    | Walk j from lo to hi-1: if arr[j] ≤ pivot, swap arr[++i] with arr[j]    | [3,1,2,10,6,8,4] (i = 2)          |
| 3    | Swap pivot into arr[i+1]; recurse on [lo..i] and [i+2..hi]              | [3,1,2,4,6,8,10] (pivot at index 3) |
Tip: Use random pivot selection or median-of-three to avoid O(n²) on already-sorted input. Java's Arrays.sort uses a dual-pivot quicksort for primitives.
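The steps above correspond to the Lomuto partition scheme; a runnable sketch with the random-pivot safeguard from the tip might look like this (class and method names are illustrative):

```java
import java.util.concurrent.ThreadLocalRandom;

public class QuickSort {
    static void quickSort(int[] arr, int lo, int hi) {
        if (lo >= hi) return;
        int p = partition(arr, lo, hi);
        quickSort(arr, lo, p - 1);
        quickSort(arr, p + 1, hi);
    }

    // Lomuto partition with a random pivot to dodge the O(n^2) sorted-input case.
    static int partition(int[] arr, int lo, int hi) {
        int r = ThreadLocalRandom.current().nextInt(lo, hi + 1);
        swap(arr, r, hi);              // move the random pivot to the end
        int pivot = arr[hi];
        int i = lo - 1;
        for (int j = lo; j < hi; j++) {
            if (arr[j] <= pivot) swap(arr, ++i, j);
        }
        swap(arr, i + 1, hi);          // pivot lands at its final sorted position
        return i + 1;
    }

    static void swap(int[] arr, int i, int j) {
        int t = arr[i]; arr[i] = arr[j]; arr[j] = t;
    }

    public static void main(String[] args) {
        int[] a = {3, 6, 8, 10, 1, 2, 4};
        quickSort(a, 0, a.length - 1);
        System.out.println(java.util.Arrays.toString(a)); // [1, 2, 3, 4, 6, 8, 10]
    }
}
```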
Heap Sort – Build Heap + Extract
Build a max-heap in O(n), then repeatedly extract the maximum to sort in-place in O(n log n) guaranteed, with O(1) extra space.
| Step | Action                                                                    | State After                   |
| 1    | Heapify (build max-heap): call sift-down from last internal node to root  | Array satisfies heap property |
| 2    | Swap root (max) with last element; reduce heap size by 1                  | Largest element placed at end |
| 3    | Sift-down root to restore heap; repeat steps 2-3 until one element remains | Array fully sorted ascending |
Tip: Heap sort excels when you need worst-case O(n log n) with O(1) space and don't care about stability. It's less cache-friendly than merge/quick sort because heap traversal accesses non-contiguous memory.
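The three steps above can be sketched as a compact Java implementation (an iterative sift-down is one of several reasonable choices):

```java
public class HeapSort {
    static void heapSort(int[] arr) {
        int n = arr.length;
        // Step 1 — build the max-heap: sift down every internal node. O(n).
        for (int i = n / 2 - 1; i >= 0; i--) siftDown(arr, i, n);
        // Steps 2-3 — move the max to the end, shrink the heap, restore. O(n log n).
        for (int end = n - 1; end > 0; end--) {
            int t = arr[0]; arr[0] = arr[end]; arr[end] = t;
            siftDown(arr, 0, end);
        }
    }

    // Restore the max-heap property for the subtree rooted at i (heap size n).
    static void siftDown(int[] arr, int i, int n) {
        while (true) {
            int largest = i, l = 2 * i + 1, r = 2 * i + 2;
            if (l < n && arr[l] > arr[largest]) largest = l;
            if (r < n && arr[r] > arr[largest]) largest = r;
            if (largest == i) return;
            int t = arr[i]; arr[i] = arr[largest]; arr[largest] = t;
            i = largest;
        }
    }

    public static void main(String[] args) {
        int[] a = {12, 11, 13, 5, 6, 7};
        heapSort(a);
        System.out.println(java.util.Arrays.toString(a)); // [5, 6, 7, 11, 12, 13]
    }
}
```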
Radix Sort – Non-Comparison Linear Sort
Sort integers digit-by-digit from least significant digit (LSD) to most significant, using a stable sub-sort (counting sort) at each digit position.
| Step | Action                                                                     | State After                                  |
| 1    | Find max value to determine number of digit passes d                       | d = number of passes needed                  |
| 2    | For each digit position (1s, 10s, 100s, ...): stable counting-sort on digit | Array partially ordered through current digit |
| 3    | After all d passes, array is fully sorted                                  | O(d·n) total, O(n+k) space                   |
Tip: Radix sort beats comparison sorts when keys are integers with bounded digit count d ≪ log n (e.g., 32-bit ints → d = 10 decimal digits). Not suitable for floating-point or arbitrary objects.
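A minimal LSD radix sort following the steps above, assuming non-negative ints (a simplification of this sketch; signed values need an extra offset pass):

```java
public class RadixSort {
    // LSD radix sort: one stable counting pass per decimal digit.
    static void radixSort(int[] arr) {
        if (arr.length == 0) return;
        int max = arr[0];
        for (int v : arr) max = Math.max(max, v);   // step 1: digit count
        for (int exp = 1; max / exp > 0; exp *= 10) // step 2: 1s, 10s, 100s, ...
            countingPass(arr, exp);
    }

    // Stable counting sort keyed on the digit selected by exp.
    static void countingPass(int[] arr, int exp) {
        int[] out = new int[arr.length];
        int[] count = new int[10];
        for (int v : arr) count[(v / exp) % 10]++;
        for (int d = 1; d < 10; d++) count[d] += count[d - 1]; // prefix sums
        for (int i = arr.length - 1; i >= 0; i--)              // backward => stable
            out[--count[(arr[i] / exp) % 10]] = arr[i];
        System.arraycopy(out, 0, arr, 0, arr.length);
    }

    public static void main(String[] args) {
        int[] a = {170, 45, 75, 90, 802, 24, 2, 66};
        radixSort(a);
        System.out.println(java.util.Arrays.toString(a)); // [2, 24, 45, 66, 75, 90, 170, 802]
    }
}
```

Note the backward iteration in the counting pass: it is what keeps each pass stable, and stability is what makes earlier (lower-digit) passes survive later ones.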
Common Mistake: Using quick sort on already-sorted input with a fixed pivot (first or last element) causes O(n²) performance. Always use a random pivot or median-of-three. Also, forgetting that merge sort needs O(n) auxiliary space can cause OOM in memory-constrained environments.
03
Section Three · Visuals

Diagrams & Walkthroughs

Merge Sort – Divide & Merge Tree

    divide ↓ (log n levels)
              [38, 27, 43, 3, 9, 82, 10]
         [38, 27, 43]         [3, 9, 82, 10]
       [38]   [27, 43]      [3, 9]   [82, 10]
    merge ↑ (stable: pick the smaller front element)
         [27, 38, 43]         [3, 9, 10, 82]
              [3, 9, 10, 27, 38, 43, 82] ✓
Quick Sort – Partition Walkthrough

    Input (pivot = 4, the last element):
        3   6   8   1   2   [4]
                ↓ partition
    After partition, the pivot sits at its final sorted position:
        3   2   1   [4]   6   8
        < 4 → recurse    fixed    > 4 → recurse
Complexity Curves Comparison
[Growth-rate plot, time vs. input n: O(n²) (Bubble/Insertion/Selection) climbs fastest, O(n log n) (Merge/Quick/Heap) sits between, O(n) (Counting/Radix) grows slowest.]
04
Section Four · Code

Implementations

Quick-reference Java merge sort.

// Merge Sort – Java
void mergeSort(int[] arr, int l, int r) {
    if (l >= r) return;
    int mid = l + (r - l) / 2;
    mergeSort(arr, l, mid);
    mergeSort(arr, mid + 1, r);
    merge(arr, l, mid, r); // stable O(n) merge
}
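The quick-reference snippet calls a merge helper that isn't shown. One possible runnable completion (buffer handling is one of several reasonable choices):

```java
public class MergeSort {
    static void mergeSort(int[] arr, int l, int r) {
        if (l >= r) return;
        int mid = l + (r - l) / 2;
        mergeSort(arr, l, mid);
        mergeSort(arr, mid + 1, r);
        merge(arr, l, mid, r);
    }

    // Stable O(n) merge: copy the range out, write back the smaller front element.
    static void merge(int[] arr, int l, int mid, int r) {
        int[] tmp = new int[r - l + 1];
        int i = l, j = mid + 1, k = 0;
        while (i <= mid && j <= r)
            tmp[k++] = (arr[i] <= arr[j]) ? arr[i++] : arr[j++]; // '<=' keeps ties left-first => stable
        while (i <= mid) tmp[k++] = arr[i++];
        while (j <= r)   tmp[k++] = arr[j++];
        System.arraycopy(tmp, 0, arr, l, tmp.length);
    }

    public static void main(String[] args) {
        int[] a = {5, 3, 8, 1};
        mergeSort(a, 0, a.length - 1);
        System.out.println(java.util.Arrays.toString(a)); // [1, 3, 5, 8]
    }
}
```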
05
Section Five · Complexity

Time & Space Complexity

| Algorithm      | Best       | Average    | Worst      | Space    | Stable |
| Merge Sort     | O(n log n) | O(n log n) | O(n log n) | O(n)     | ✓      |
| Quick Sort     | O(n log n) | O(n log n) | O(n²)*     | O(log n) | ✗      |
| Heap Sort      | O(n log n) | O(n log n) | O(n log n) | O(1)     | ✗      |
| Insertion Sort | O(n)       | O(n²)      | O(n²)      | O(1)     | ✓      |
| Radix Sort     | O(nk)      | O(nk)      | O(nk)      | O(n+k)   | ✓      |
| Counting Sort  | O(n+k)     | O(n+k)     | O(n+k)     | O(n+k)   | ✓      |
* Quick Sort worst case: occurs with a bad pivot choice on already-sorted or reverse-sorted input (e.g., always picking the last element as pivot). Random pivot selection reduces this to O(n log n) with very high probability. Java's dual-pivot quicksort in Arrays.sort achieves O(n log n) in practice with O(log n) stack space.
Space Complexity Note

Merge sort requires O(n) auxiliary space for the merge buffer. Quick sort uses O(log n) stack space for recursion. Heap sort is the only comparison sort achieving O(n log n) worst-case with O(1) extra space. Radix sort trades time linearity for O(n+k) auxiliary space (output array + counting buckets).

06
Section Six · Decision Guide

When to Use Sorting

Use Merge Sort When
  • You need a stable sort with guaranteed O(n log n)
  • Sorting linked lists (no random-access needed)
  • External sorting (data larger than RAM)
  • Counting inversions via the merge step
Use Quick Sort When
  • In-practice fastest for arrays of primitives
  • Memory is constrained: only O(log n) stack space
  • Stability not required and average case is fine
Use Heap Sort When
  • Need worst-case O(n log n) + O(1) space
  • Partial sort: finding top-K elements
  • Real-time systems needing predictable performance
Use Radix/Counting Sort When
  • Keys are integers with a bounded range k that is not much larger than n
  • You need linear O(n) time instead of O(n log n)
  • Strings with fixed-length keys
Algorithm Decision Table
| Scenario                             | Recommended                    | Why                                                  |
| General-purpose array sort           | Quick Sort (random pivot)      | Best cache performance, O(n log n) average           |
| Need stable sort                     | Merge Sort / Timsort           | Stable, O(n log n) guaranteed                        |
| Memory critical, no stability        | Heap Sort                      | O(1) space, O(n log n) worst                         |
| Nearly-sorted input (< 10% disorder) | Insertion Sort / Timsort       | O(n) on almost-sorted data                           |
| Integer keys, small range (k < 100,000) | Counting Sort               | O(n+k) linear                                        |
| Integer keys, large but bounded      | Radix Sort                     | O(nk), linear in digit count                         |
| Java production code                 | Arrays.sort / Collections.sort | Dual-pivot quicksort (primitives), Timsort (objects) |
✗ Never use bubble or selection sort in production or interviews unless the problem explicitly asks to implement a naive O(n²) sort for educational purposes.
07
Section Seven · Interview Practice

LeetCode Problems

Sorting problems in interviews test custom comparators, divide-and-conquer thinking, and recognising when sorting unlocks a two-pointer or greedy solution. Many "hard" problems become medium after sorting the input.

  • Medium · #912 Sort an Array: implement merge sort or heap sort from scratch without using built-in sort.
  • Easy · #88 Merge Sorted Array: merge two sorted arrays in-place from the back to avoid O(n) shifts.
  • Medium · #56 Merge Intervals: sort by start time, then greedily merge overlapping intervals.
  • Medium · #75 Sort Colors: Dutch National Flag, a 3-way partition in O(n) time, O(1) space.
  • Medium · #179 Largest Number: custom comparator, compare ab vs ba as strings to build the largest concatenation.
  • Medium · #148 Sort List: merge sort on a linked list; O(n log n) time, O(log n) stack space.
  • Medium · #324 Wiggle Sort II: sort, then interleave larger elements at odd indices, smaller at even.
  • Hard · #315 Count of Smaller Numbers After Self: a modified merge sort counts inversions during the merge step.
  • Hard · #493 Reverse Pairs: merge sort variant, counting pairs where arr[i] > 2 * arr[j] during the merge.
  • Hard · #4 Median of Two Sorted Arrays: binary search on the partition, O(log(min(m,n))); no sort needed.
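As one example of the custom-comparator pattern from the list above, the core of #179 can be sketched as follows (class and method names are illustrative; the all-zeros edge case is handled explicitly):

```java
import java.util.Arrays;

public class LargestNumber {
    // #179 sketch: order numbers so that concatenating them yields the largest string.
    static String largestNumber(int[] nums) {
        String[] s = new String[nums.length];
        for (int i = 0; i < nums.length; i++) s[i] = Integer.toString(nums[i]);
        // Compare "ba" vs "ab": a sorts before b exactly when "ab" is the bigger string.
        Arrays.sort(s, (a, b) -> (b + a).compareTo(a + b));
        if (s[0].equals("0")) return "0"; // e.g. [0, 0] must collapse to "0", not "00"
        return String.join("", s);
    }

    public static void main(String[] args) {
        System.out.println(largestNumber(new int[]{3, 30, 34, 5, 9})); // 9534330
    }
}
```

The comparator is a total order (it compares two fixed strings), so Arrays.sort accepts it; the trick generalizes to any problem where pairwise concatenation order determines global order.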
Interview Pattern: When you see "find pairs", "count inversions", or "merge intervals", think sorting first. A custom Comparator or sorting key often reduces an O(n²) brute force to O(n log n). Always ask: "Would sorting the input first make this problem easier?"