Watch sorting algorithms work step by step. Compare performance, understand time complexity, and see how different strategies tackle the same problem.
| Algorithm | Best | Average | Worst | Space |
|---|---|---|---|---|
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) |
| Shell Sort | O(n log n) | O(n^(3/2)) | O(n²) | O(1) |
| Counting Sort | O(n+k) | O(n+k) | O(n+k) | O(k) |
Repeatedly steps through the list, comparing adjacent elements and swapping them if they are in the wrong order; passes repeat until no swaps are needed. Named because smaller elements “bubble” to the top. Mainly of educational value, though with an early-exit check it approaches O(n) on nearly-sorted data.
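A minimal Python sketch of the idea, including the early-exit check that gives the O(n) best case (illustrative, not the visualizer's implementation):

```python
def bubble_sort(a):
    """Sort a list in place; stop early once a full pass makes no swaps."""
    n = len(a)
    for end in range(n - 1, 0, -1):
        swapped = False
        for i in range(end):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]  # swap out-of-order neighbors
                swapped = True
        if not swapped:  # no swaps means already sorted: O(n) best case
            break
    return a
```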
Finds the minimum element from the unsorted portion and places it at the beginning. It divides the array into sorted and unsorted regions, growing the sorted region one element at a time.
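The same idea as a short Python sketch (one of several equivalent ways to write it):

```python
def selection_sort(a):
    """Grow a sorted prefix by swapping in the minimum of the unsorted rest."""
    n = len(a)
    for i in range(n - 1):
        m = min(range(i, n), key=a.__getitem__)  # index of smallest unsorted element
        a[i], a[m] = a[m], a[i]  # move it to the front of the unsorted region
    return a
```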
Builds the sorted array one element at a time by repeatedly picking the next element and inserting it into the correct position among the already-sorted elements. Works like sorting playing cards in your hand.
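The card-sorting analogy translates almost directly into code; a minimal sketch:

```python
def insertion_sort(a):
    """Insert each element into its place among the already-sorted prefix."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:  # shift larger elements right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key  # drop the element into its slot
    return a
```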
A divide-and-conquer algorithm that splits the array in half, recursively sorts each half, then merges the two sorted halves together. Guaranteed O(n log n) performance regardless of input.
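A compact top-down sketch in Python (copies slices for clarity; this is where the O(n) auxiliary space goes):

```python
def merge_sort(a):
    """Split, recursively sort each half, then merge the sorted halves."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= keeps equal elements in order (stable)
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]  # append whichever half remains
```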
Picks a pivot element and partitions the array so elements less than the pivot come before it and greater elements come after. Recursively sorts the partitions. In practice, the fastest general-purpose sort.
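A readable (not in-place) Python sketch; production versions usually partition in place with the Lomuto or Hoare scheme to get the O(log n) space bound, but the list-comprehension form shows the partitioning idea most clearly:

```python
def quick_sort(a):
    """Partition around a pivot, then recursively sort each partition."""
    if len(a) <= 1:
        return a
    pivot = a[len(a) // 2]  # middle element as pivot (one common choice)
    less    = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)
```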
Builds a max-heap from the array, then repeatedly extracts the maximum element and places it at the end. Combines the best of merge sort (guaranteed O(n log n)) with selection sort (in-place).
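Both phases (heapify, then repeated extraction) fit in a short in-place sketch:

```python
def heap_sort(a):
    """Build a max-heap in place, then repeatedly move the max to the end."""
    n = len(a)

    def sift_down(root, end):
        """Restore the heap property for the subtree at `root` within a[:end]."""
        while True:
            child = 2 * root + 1
            if child >= end:
                return
            if child + 1 < end and a[child + 1] > a[child]:
                child += 1  # pick the larger child
            if a[root] >= a[child]:
                return
            a[root], a[child] = a[child], a[root]
            root = child

    for root in range(n // 2 - 1, -1, -1):  # build the max-heap bottom-up
        sift_down(root, n)
    for end in range(n - 1, 0, -1):  # extract the max into its final slot
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
    return a
```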
A generalization of insertion sort that allows exchange of far-apart elements. Uses a decreasing gap sequence to sort interleaved sublists, progressively reducing the gap until a final insertion sort pass.
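A sketch using Knuth's gap sequence (1, 4, 13, 40, …); other gap sequences are possible and give different average-case bounds:

```python
def shell_sort(a):
    """Gapped insertion sort with a shrinking gap; final pass has gap 1."""
    n = len(a)
    gap = 1
    while gap < n // 3:
        gap = 3 * gap + 1  # Knuth's sequence: 1, 4, 13, 40, ...
    while gap >= 1:
        for i in range(gap, n):  # insertion sort on each gap-interleaved sublist
            key = a[i]
            j = i
            while j >= gap and a[j - gap] > key:
                a[j] = a[j - gap]
                j -= gap
            a[j] = key
        gap //= 3  # shrink the gap toward the final pass
    return a
```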
A non-comparison sort that counts the frequency of each distinct value, then uses arithmetic to place each element in its correct position. Extremely fast when the range of values (k) is small relative to n.
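A stable sketch for non-negative integers (the usual textbook setting; negative values need an offset first):

```python
def counting_sort(a):
    """Stable counting sort for non-negative integers; k = max value + 1."""
    if not a:
        return a
    k = max(a) + 1
    count = [0] * k
    for x in a:
        count[x] += 1  # tally each value
    for v in range(1, k):
        count[v] += count[v - 1]  # prefix sums: one past each value's last slot
    out = [0] * len(a)
    for x in reversed(a):  # reverse scan keeps equal elements in input order
        count[x] -= 1
        out[count[x]] = x
    return out
```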
Merge Sort and Quick Sort exemplify the divide-and-conquer paradigm: break a problem into smaller subproblems, solve them recursively, then combine solutions. This same strategy appears in binary search, the FFT algorithm, and computational geometry.
Bubble, Selection, Insertion, Heap, and Shell Sort are in-place (O(1) extra space). Merge Sort needs O(n) auxiliary space. Counting Sort needs O(k) space for the count array. This tradeoff between time and space is a recurring theme in algorithm design.
A stable sort preserves the relative order of equal elements. Bubble, Insertion, Merge, and Counting Sort are stable. Selection, Quick, Heap, and Shell Sort are not. Stability matters when sorting by multiple keys — for example, sorting a spreadsheet first by name, then by date.
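The multi-key trick relies on stability: sort by the secondary key first, then by the primary key. A small Python illustration with made-up (name, date) rows, using Python's built-in sort, which is guaranteed stable:

```python
# Hypothetical sample rows as (name, date) pairs.
rows = [("Bea", 3), ("Ada", 1), ("Bea", 1), ("Ada", 2)]
rows.sort(key=lambda r: r[1])  # first by date (secondary key)
rows.sort(key=lambda r: r[0])  # then by name; stability keeps dates in order
# Rows with equal names remain ordered by date.
```

With an unstable sort, the second pass could scramble the date order within each name.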
Comparison-based sorts require Ω(n log n) comparisons in the worst case; this is a proven mathematical lower bound. Counting Sort sidesteps it by never comparing elements directly, instead using element values as array indices. This is why it achieves O(n+k) time.
Sorting algorithms are the building blocks of computational thinking — simple rules creating efficient order from chaos. Tactiko applies the same principle: basic player moves generate complex football tactics, just like how Quick Sort's simple partitioning creates elegant efficiency.
Play Tactiko