Decrease & Conquer
Insertion Sort
Graph Searching
Selection & Median
Binary Search Trees
Insertion Sort Example
5 2 4 6 1 3     (initial)
2 5 4 6 1 3     (after j = 2)
2 4 5 6 1 3     (after j = 3)
2 4 5 6 1 3     (after j = 4)
1 2 4 5 6 3     (after j = 5)
1 2 3 4 5 6     (after j = 6) done
The operation of Insertion-Sort on the array A = [5, 2, 4, 6, 1, 3];
the position of index j is indicated at each step.
Insertion Sort
Given: Let A[1, …, n] be the array to be sorted.
1. for j ← 2 to |A|
2.    do                            ► insert A[j] into sorted A[1, …, j − 1]
3.       key ← A[j]                 ► save value of A[j]
4.       i ← j − 1                  ► look for proper position to the left
5.       while i > 0 and A[i] > key
6.          do                      ► original A[j] < current A[i]
7.             A[i + 1] ← A[i]      ► move A[i] to the right
8.             i ← i − 1            ► move index to the left
9.       A[i + 1] ← key             ► insert original A[j] in proper position
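The pseudocode above translates directly to Python (0-indexed, where the slides use 1-indexed arrays):

```python
def insertion_sort(a):
    """Sort list a in place; mirrors lines (1)-(9) above, 0-indexed."""
    for j in range(1, len(a)):          # line (1): j = 2 .. |A|
        key = a[j]                      # line (3): save A[j]
        i = j - 1                       # line (4): look to the left
        while i >= 0 and a[i] > key:    # line (5)
            a[i + 1] = a[i]             # line (7): move a[i] to the right
            i -= 1                      # line (8): move index to the left
        a[i + 1] = key                  # line (9): insert key in proper position
    return a
```

Running it on the example array A = [5, 2, 4, 6, 1, 3] reproduces the trace shown above.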
Correctness of Insertion Sort
Claim:
At the end A[1, …, n] is sorted.
Proof:
Show A[1, …, j] is sorted, by induction on j.
Basis:
j = 1. A[1] is trivially sorted.
Induction: Assume A[1, …, j − 1] is sorted by induction
hypothesis. Lines (6) – (8) move these elements over, so
they don’t get out of order. Line (9) inserts A[j] into this
slightly expanded array when line (5) fails, implying A[j] <
A[i + 1, …, j − 1] and either i = 0 or A[i] ≤ A[j]. Both imply
the new A[1, …, j] is sorted.
[Figure: array A[1, …, j] with positions i, i + 1, …, j marked;
see that A[i] ≤ A[j] < A[i + 1].]
Complexity of Insertion Sort
Count the number of lines executed, bounded above.
Line (1) is executed n times (final test takes an extra step).
Lines (2) – (4) and (9) are each executed at most n times.
For each j, lines (5) – (8) are executed at most j times, so the
total there is less than n · n = n² times.
We are not interested in constants of proportionality, and we
are usually only interested in the dominant number of steps
as n becomes large.
t(n) ≤ 4n² + 5n + 2

where the 4n² term counts lines 5 – 8, the 5n term counts lines 1, 2 – 4,
and 9, and the constant 2 is the time it takes to start and stop the
algorithm.
We say t(n) is quadratic because it grows proportional to n².
BFS idea
Enqueue start vertex
repeat
   u ← Out(Q)
   put all unvisited neighbors of u into Q
until Q is empty

[Figure: queue Q between unvisited and visited vertices; unvisited
vertices v enter at In, vertices u leave at Out; the queue is ordered
(partially) by spatial distance.]
BFS code
Idea: move away from initial vertex slowly, processing
neighbors in parallel, using a queue (with distance)
BFS(v)                              {set d(v) = 0 initially}
   visit(v)                         {mark node}
   for all neighbors u of v         {all at once}
      if u is unvisited, then       {in some particular order}
         [mark edge (v, u)]         {to create spanning tree}
         BFS(u)                     {in parallel}
         [let d(u) = d(v) + 1]      {to calculate distance}
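Since the recursion in the slide is meant conceptually ("in parallel"), an explicit queue makes the idea concrete. A minimal Python sketch, assuming the graph is given as an adjacency dict (an illustration choice, not from the slides):

```python
from collections import deque

def bfs(adj, start):
    """Return the distance d(u) from start for every reachable vertex u."""
    d = {start: 0}                  # set d(v) = 0 initially; d also marks visited
    q = deque([start])              # enqueue start vertex
    while q:                        # until Q is empty
        u = q.popleft()             # u <- Out(Q)
        for v in adj[u]:            # all neighbors of u
            if v not in d:          # unvisited
                d[v] = d[u] + 1     # to calculate distance
                q.append(v)         # put v into Q
    return d
```

Marking a vertex the moment it is enqueued (here, when its distance is recorded) is what keeps each vertex in the queue at most once.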
BFS example
DFS idea
[Figure: stack S between unvisited and visited vertices; unvisited
vertices v are Pushed on, vertices u are Popped off; "all done" when
the stack empties.]

Push start vertex
repeat
   u ← Top(S)
   push any unvisited neighbor of u,
   otherwise pop(S)
until S is empty
DFS code
Idea: move away from initial vertex quickly, processing
neighbors sequentially, using a stack (and global time)
DFS(v)                              {set s(v) = f(v) = 0 initially}
   visit(v)                         {mark node}
   [let s(v) = t = t + 1]           {start time}
   for each neighbor u of v         {one at a time}
      if u is unvisited, then       {in some particular order}
         [mark edge (v, u)]         {of spanning tree}
         DFS(u)                     {recursively}
   [let f(v) = t = t + 1]           {finish time}
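The recursive scheme above, with the global clock t, can be sketched in Python; the adjacency-dict graph representation is an illustration assumption:

```python
def dfs(adj, start):
    """Return (s, f): start and finish times for each vertex reached."""
    s, f = {}, {}                   # s(v) = f(v) undefined initially
    t = 0                           # global time

    def visit(v):
        nonlocal t
        t += 1
        s[v] = t                    # start time s(v); also marks v visited
        for u in adj[v]:            # one at a time, in some particular order
            if u not in s:          # unvisited
                visit(u)            # recursively
        t += 1
        f[v] = t                    # finish time f(v)

    visit(start)
    return s, f
```

The interval [s(v), f(v)] of a vertex nests inside its parent's interval, which is the property that makes these timestamps useful later.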
Fake-coin puzzle
Problem: Determine one fake (lighter) coin among n
identically looking coins using a balance scale with no
weights.
Algorithm: (decrease-by-half)
If n is even, divide the coins into two equal piles; if n is odd,
divide into two equal piles with one coin left over. Compare the
piles' weights and repeat on the lighter pile (if the piles
balance, the leftover coin is the fake).
Question: Is there a better algorithm?
Answer: Yes, decrease by a third!
How: Assigned exercise
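The decrease-by-half algorithm can be simulated in Python; representing the coins as a list of weights (the fake one lighter) and returning its index are illustration assumptions:

```python
def find_fake(coins):
    """Return the index of the lighter fake coin among coins (weights)."""
    lo, hi = 0, len(coins)                  # current pile is coins[lo:hi]
    while hi - lo > 1:
        half = (hi - lo) // 2               # divide into two equal piles
        left = sum(coins[lo:lo + half])              # weigh left pile
        right = sum(coins[lo + half:lo + 2 * half])  # weigh right pile
        if left < right:
            hi = lo + half                  # fake is in the lighter left pile
        elif right < left:
            lo, hi = lo + half, lo + 2 * half  # fake is in the right pile
        else:
            return hi - 1                   # piles balance: leftover coin is fake
    return lo
```

Each weighing at least halves the candidate pile, so the number of weighings is about log₂ n; the decrease-by-a-third version improves this to log₃ n.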
Russian peasant multiplication
Problem: Compute the product of two integers
Approach: decrease-by-half:

   n × m = (n/2) × 2m              if n is even
   n × m = ((n − 1)/2) × 2m + m    if n is odd

Example (n = 12, m = 25):

   n     m
   12    25
    6    50
    3   100    ← n odd: add m
    1   200    ← n odd: add m

   100 + 200 = 300
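The halving scheme above can be sketched as:

```python
def russian_multiply(n, m):
    """Compute n * m by halving n and doubling m, adding m when n is odd."""
    total = 0
    while n >= 1:
        if n % 2 == 1:      # n odd: add the current m
            total += m
        n //= 2             # decrease n by half
        m *= 2              # double m
    return total
```

This is binary multiplication in disguise: the odd steps pick out the 1-bits of n.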
Josephus Problem
Problem: n people stand in a circle. Starting with the first
person, every other person is eliminated until just one person
is left. Find J(n), the original position of the remaining person.
[Figure: example circles for small n, with each person subscripted by
the pass on which that person is eliminated.]

J(n) = 2J(n/2) − 1              if n is even
J(n) = 2J((n − 1)/2) + 1        if n is odd
Key: The subscript is the pass on which the person is
eliminated.
Theorem: J(n) in binary can be obtained by a one-bit left cyclic
shift of n’s binary representation.
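Both the recurrence and the cyclic-shift theorem are easy to check in Python:

```python
def J(n):
    """Survivor's position via the decrease-by-half recurrence."""
    if n == 1:
        return 1
    if n % 2 == 0:
        return 2 * J(n // 2) - 1        # n even
    return 2 * J((n - 1) // 2) + 1      # n odd

def J_shift(n):
    """One-bit left cyclic shift of n's binary representation."""
    b = bin(n)[2:]
    return int(b[1:] + b[0], 2)
```

For example, n = 6 = 110₂ shifts to 101₂ = 5, and indeed J(6) = 2J(3) − 1 = 5.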
Selection
Definition: The lth order statistic of {a1, …, an} is the lth
smallest element a, such that l = |{i : ai ≤ a}|.
Example: max is l = n; min is l = 1; median is l = n/2
Selection: Given {a1, …, an} (all distinct) and 1 ≤ l ≤ n.
Find the lth order statistic.
Approaches: Sort and choose lth element. (But max and
min have O(n) algorithms).
QuickSelect: split about a pivot and call recursively
[Figure: SPLIT rearranges A[i .. j] about a pivot x, placing elements
< x to its left and elements > x to its right, with the pivot ending
at position k; select recursively on the left part if l < k, on the
right part if l > k.]
Selection algorithm
QuickSelect(A, i, j, l)      ► finds the (l − i + 1)th order statistic of A[i .. j]
   if i = j (= l) then return A[i]        ► it's a one-element array
   else k ← SPLIT(A, i, j)                ► split around random point
      if l < k then return QuickSelect(A, i, k − 1, l)
      if l = k then return A[k]           ► found it
      if l > k then return QuickSelect(A, k + 1, j, l)
Note: l is an absolute index; k is the pivot's absolute position,
returned by SPLIT, which rearranges A as a side effect.
Complexity: O(n) expected (average-case)
O(n²) worst-case
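A Python sketch of QuickSelect, using list partitioning instead of the in-place SPLIT (a simplification for illustration; l is 1-indexed relative to the list):

```python
import random

def quickselect(a, l):
    """Return the l-th (1-indexed) smallest element of a (distinct values)."""
    pivot = random.choice(a)                # split around a random point
    less = [x for x in a if x < pivot]
    greater = [x for x in a if x > pivot]
    k = len(less) + 1                       # pivot's rank in a
    if l < k:
        return quickselect(less, l)         # select recursively on the left
    if l == k:
        return pivot                        # found it
    return quickselect(greater, l - k)      # select recursively on the right
```

The random pivot is what gives the O(n) expected bound; an adversarial pivot choice yields the O(n²) worst case.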
Median of Medians algorithm
Idea: Guarantee that the pivot is “near” the middle.
SELECT:
1. Group the array elements into ⌈n/5⌉ groups of five
and find the median of each group:
2. Call SELECT recursively on the set of medians to find
the “median of the medians”, x*
3. Call SPLIT, except make it pivot around x*
4. Do recursive call to SELECT same as before.
[Figure: columns of five elements with each group's median marked;
elements above a median are > it, elements below are < it.]
Median of Medians analysis
[Figure: the groups of five, put in order by medians for illustration
purposes; elements on one side of x* are > x*, on the other side < x*.
The worst-case call in step 4 is on 3/4 · n elements (actually
7/10 · n) for sufficiently large n (44?).]

Costs of the steps:
1. O(n)
2. T(n/5)
3. O(n)
4. T(3n/4)

Each level of the recursion shrinks the total work by a factor of
3/4 + 1/5 = 19/20, so the levels sum to

   n + (19/20)n + (19/20)²n + · · · < 20n = O(n)

T(n) = T(3n/4) + T(n/5) + O(n)
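The four steps of SELECT can be sketched as follows (distinct elements assumed; lists stand in for the in-place SPLIT, and l is 1-indexed):

```python
def mom_select(a, l):
    """Return the l-th (1-indexed) smallest of a; worst-case O(n)."""
    if len(a) <= 5:
        return sorted(a)[l - 1]             # small base case
    # 1. group into fives and take each group's median
    medians = [sorted(a[i:i + 5])[len(a[i:i + 5]) // 2]
               for i in range(0, len(a), 5)]
    # 2. recursively find the median of the medians, x*
    x = mom_select(medians, (len(medians) + 1) // 2)
    # 3. split around x*
    less = [y for y in a if y < x]
    greater = [y for y in a if y > x]
    k = len(less) + 1                       # rank of x* in a
    # 4. recursive call on the correct side, same as QuickSelect
    if l < k:
        return mom_select(less, l)
    if l == k:
        return x
    return mom_select(greater, l - k)
```

The pivot x* is guaranteed to be "near" the middle, which is exactly what bounds the step-4 call and yields the linear worst case.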
Basic Operations
To search for a value: (time is O(h), h = height of tree)
TREE-SEARCH(t, k) ► returns pointer to node with value k
(1) if t = nil or k = key[t] then return t
► “returns root”
(2) if k < key[t] then return TREE-SEARCH(left[t],k)
(3) if k > key[t] then return TREE-SEARCH(right[t],k)
Computing the Minimum or Maximum value in a search tree requires no
key comparisons and takes O(h) time [Exercise 7]:
TREE-MINIMUM(x)
(1) while left[x] ≠ nil
(2) do x ← left[x]
(3) return x
TREE-MAXIMUM(x)
(1) while right[x] ≠ nil
(2) do x ← right[x]
(3) return x
Successor & Predecessor
Compute values without comparisons in O(h) time.
TREE-SUCCESSOR(x)                   ► returns nil if x is maximal
1. if right[x] ≠ nil
2.    then return TREE-MINIMUM(right[x])
3. repeat                           ► x doesn't have a right child:
                                      go up until you go right
4.    y ← x                         ► save value of x
5.    x ← parent[x]                 ► go up
6. until x = nil or left[x] = y     ► reached root or successor
7. return x
TREE-PREDECESSOR is done similarly
Insertion of Keys
TREE-INSERT(T, k)                   ► inserts key k into tree T
1. if T = nil then                  ► T is empty
2.    T ← newnode                   ► make a new node (tree)
3.    key[T] ← k                    ► put the key there
4. else                             ► in a non-empty tree
5.    if k = key[T] then            ► do nothing, already in tree
6.    if k < key[T] then TREE-INSERT(left[T], k)
7.    if k > key[T] then TREE-INSERT(right[T], k)
Note: always adds a leaf; O(h) time
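A pointer-based Python sketch of TREE-INSERT and TREE-SEARCH; the Node class is an illustration assumption standing in for the key/left/right fields:

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def tree_insert(t, k):
    """Insert key k into tree t, returning the (possibly new) root."""
    if t is None:
        return Node(k)                      # empty tree: make a new node
    if k < t.key:
        t.left = tree_insert(t.left, k)     # insert in left subtree
    elif k > t.key:
        t.right = tree_insert(t.right, k)   # insert in right subtree
    return t                                # k == t.key: already in tree

def tree_search(t, k):
    """Return the node holding key k, or None; O(h) time."""
    if t is None or k == t.key:
        return t
    return tree_search(t.left, k) if k < t.key else tree_search(t.right, k)
```

As the note says, insertion always adds a leaf, so both operations follow a single root-to-leaf path and run in O(h) time.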
Deletion of keys
(Exercise 8)
TREE-DELETE(T, k)                   ► deletes key k from tree T (O(h) time)
1. if T ≠ nil then                  ► do nothing if the tree is empty
2.    if k < key[T] then TREE-DELETE(left[T], k)
3.    if k > key[T] then TREE-DELETE(right[T], k)
4.    if k = key[T] then            ► found it
5.       if left[T] = nil then T ← right[T]     ► bypass on right
6.       elif right[T] = nil then T ← left[T]   ► bypass on left
7.       else key[T] ← key[TREE-SUCCESSOR(T)]
            ► replace deleted key by its successor (or predecessor)
8.          TREE-DELETE(right[T], key[T]) -- or left side
            ► remove successor (or predecessor) key
Why does this last case only occur once?
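A Python sketch of TREE-DELETE using the successor, with minimal Node and insert helpers included so the sketch stands alone (all names are illustration choices):

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def tree_insert(t, k):
    if t is None:
        return Node(k)
    if k < t.key:
        t.left = tree_insert(t.left, k)
    elif k > t.key:
        t.right = tree_insert(t.right, k)
    return t

def tree_minimum(t):
    while t.left is not None:
        t = t.left
    return t

def tree_delete(t, k):
    """Delete key k from tree t, returning the new root; O(h) time."""
    if t is None:
        return None                         # do nothing if the tree is empty
    if k < t.key:
        t.left = tree_delete(t.left, k)
    elif k > t.key:
        t.right = tree_delete(t.right, k)
    else:                                   # found it
        if t.left is None:
            return t.right                  # bypass on right
        if t.right is None:
            return t.left                   # bypass on left
        succ = tree_minimum(t.right)        # successor: leftmost in right subtree
        t.key = succ.key                    # replace deleted key by its successor
        t.right = tree_delete(t.right, succ.key)  # remove successor key
    return t

def inorder(t):
    """Keys in sorted order, for checking."""
    return [] if t is None else inorder(t.left) + [t.key] + inorder(t.right)
```

Note that the successor is the leftmost node of the right subtree, so it has no left child; deleting it therefore always falls into the simple bypass cases.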