CS 124: Data Structures & Algorithms
Spring 2014
Lecture 16: February 26
Lecturer: Robert Snapp
Scribe: Ari Larson
Note: LaTeX template courtesy of UC Berkeley EECS dept.
Disclaimer: These notes have not been subjected to the usual scrutiny reserved for formal publications.
They may be distributed outside this class only with the permission of the Instructor.
Note: Programming Pearls by Jon Bentley is a book worth reading, especially the chapter on the maximum subarray problem, in which he develops an algorithm that runs in O(n lg n).
Link to PDF version: http://www.it.iitb.ac.in/~deepak/deepak/placement/Programming pearls.pdf
16.1 Section 4.3: The Substitution Method
Recall the Towers of Hanoi problem from before the exam, with recurrence equation
$$T(n) = \begin{cases} 1 & \text{if } n = 1,\\ 2T(n-1) + 1 & \text{if } n > 1.\end{cases}$$
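As a quick sanity check (not part of the lecture), the recurrence can be evaluated directly and compared with the familiar closed form T(n) = 2^n − 1; below is a minimal Python sketch, with the function name hanoi_cost chosen here just for illustration.

# Sketch: evaluate the Towers of Hanoi recurrence and check it against
# the closed form 2^n - 1. (Illustrative; not from the lecture.)
def hanoi_cost(n: int) -> int:
    """Moves required, from T(1) = 1 and T(n) = 2*T(n-1) + 1."""
    if n == 1:
        return 1
    return 2 * hanoi_cost(n - 1) + 1

for n in range(1, 11):
    assert hanoi_cost(n) == 2**n - 1   # closed-form check
    print(n, hanoi_cost(n))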
Recall also that there are situations in which the cost of maintaining the recursion stack outweighs the benefit of a recursive algorithm. Conceptually, recursion captures the idea of self-similarity: abstractly, a recursive algorithm can be represented as a "tree"-like hierarchical structure composed of subtrees.
Visually, we see "sub-graphs within the graph" [figure: a small graph on nodes a, b, c] or "sublists within the list" [figure: a list of nodes d, e, f, g].
We can exploit this feature of self-similarity to accelerate the solution process.
Divide and Conquer
Recall the merge sort algorithm:
Merge-Sort(A, p, r)
    if p < r
        q = ⌊(p + r)/2⌋
        Merge-Sort(A, p, q)
        Merge-Sort(A, q + 1, r)
        Merge(A, p, q, r)
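For concreteness, here is a minimal runnable Python sketch of the same procedure (not from the lecture); the merge step is written out explicitly, indices are 0-based, and all names are illustrative.

# Sketch of merge sort following the pseudocode above (0-based indices).
def merge_sort(A, p, r):
    if p < r:
        q = (p + r) // 2          # q = floor((p + r) / 2)
        merge_sort(A, p, q)       # sort left half  A[p..q]
        merge_sort(A, q + 1, r)   # sort right half A[q+1..r]
        merge(A, p, q, r)         # combine the two sorted halves

def merge(A, p, q, r):
    left, right = A[p:q + 1], A[q + 1:r + 1]
    i = j = 0
    for k in range(p, r + 1):
        # take from whichever side currently has the smaller front element
        if j >= len(right) or (i < len(left) and left[i] <= right[j]):
            A[k] = left[i]; i += 1
        else:
            A[k] = right[j]; j += 1

data = [5, 2, 4, 7, 1, 3, 2, 6]
merge_sort(data, 0, len(data) - 1)
print(data)   # [1, 2, 2, 3, 4, 5, 6, 7]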
Our current objective is to quantify running time for recursive algorithms like this one.
The substitution method is used to prove the running time for a recurrence equation once a suitable hypothesis for its runtime has already been determined.
For merge sort, $n = r - p + 1$, and the cost satisfies $T(n) = T(\lfloor n/2 \rfloor) + T(\lceil n/2 \rceil) + \Theta(n)$.
A homework problem is to show that $T(n) = \Theta(n \lg n)$ using the substitution method.
In class today we will work with a similar but simpler problem:
$$T(n) = \begin{cases} 1 & \text{if } n = 1,\\ 2T(\lfloor n/2 \rfloor) + n & \text{if } n > 1.\end{cases}$$
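(Aside, not from the lecture: one way to motivate the $\Theta(n \lg n)$ guess is to tabulate $T(n)$ directly from the recurrence and compare it with $n \lg n$; a minimal Python sketch, with illustrative names, follows.)

# Sketch: tabulate T(n) from T(1) = 1, T(n) = 2*T(floor(n/2)) + n,
# and compare with n lg n. (Illustrative; not from the lecture.)
from math import log2
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n: int) -> int:
    return 1 if n == 1 else 2 * T(n // 2) + n

for n in [2, 8, 64, 1000, 2**20]:
    print(f"n={n:8d}  T(n)={T(n):10d}  n lg n={n * log2(n):12.0f}  ratio={T(n) / (n * log2(n)):.3f}")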
Proof: (using the substitution method)
Our hypothesis for this problem is that the cost $T(n) = \Theta(n \lg n)$.
In order to prove this, we will first prove that $T(n) = O(n \lg n)$ and then prove that $T(n) = \Omega(n \lg n)$.
For the substitution method we use strong mathematical induction. Previously in class we have used weak induction, in which we prove $P_1$ (the base case) and that $P_n \implies P_{n+1}$.
Strong Mathematical Induction
• Assume $P_1$.
• Inductive step: $P_1 \wedge P_2 \wedge P_3 \wedge \cdots \wedge P_n \implies P_{n+1}$.
• Go back to prove the base case $P_1$.
(In this case we need strong mathematical induction because we use $P_{\lfloor n/2 \rfloor}$, which we know lies somewhere in $P_1 \wedge P_2 \wedge P_3 \wedge \cdots \wedge P_n$.)
Our first objective is to show that $T(n) = O(n \lg n)$. We use the definition of $O(g(n))$:
$$O(g(n)) = \{\, f(n) : \text{there exist positive constants } c \text{ and } n_0 \text{ such that } 0 \le f(n) \le c\, g(n) \text{ for all } n \ge n_0 \,\}.$$
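(A quick illustration of the definition, not from the lecture: $3n + 5 = O(n)$, since taking $c = 4$ and $n_0 = 5$ gives $0 \le 3n + 5 \le 3n + n = 4n = c\,n$ for all $n \ge n_0$.)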
We assume, as the strong inductive hypothesis, that there exist constants $c > 0$ and $n_0 > 0$ such that $T(k) \le c\, k \lg k$ for all $k$ with $n_0 \le k < n$, and we consider $n > 1$.
(Our assumed base case is actually $P_2$, that $T(2) \le c \cdot 2 \lg 2 = 2c$.)
$$T(n) = 2\,T(\lfloor n/2 \rfloor) + n.$$
By the inductive hypothesis we can argue that $T(\lfloor n/2 \rfloor) \le c \lfloor n/2 \rfloor \lg \lfloor n/2 \rfloor$.
We also know that $\lfloor n/2 \rfloor < n$.
So by substitution (using $\lfloor n/2 \rfloor \le n/2$; note that $x - 1 < \lfloor x \rfloor \le x$),
$$\begin{aligned}
T(n) &\le 2c \lfloor n/2 \rfloor \lg \lfloor n/2 \rfloor + n \\
     &\le 2c \, \tfrac{n}{2} \lg \tfrac{n}{2} + n \\
     &= c\, n(\lg n - \lg 2) + n \\
     &= c\, n(\lg n - 1) + n \\
     &= c\, n \lg n - c\, n + n.
\end{aligned}$$
Here we ask: for what values of $c > 0$ is $-c\,n + n \le 0$? This holds whenever $c \ge 1$.
So we have shown that the inductive step gives $T(n) \le c\, n \lg n$ for any $c \ge 1$.
Now we return to prove the base case which we assumed earlier. We pick an $n_0$: we can use either $n_0 = 2$ or $n_0 = 3$, because for these values of $n$ the recurrence refers to $T(\lfloor n/2 \rfloor) = T(1)$. We choose $n_0 = 3$ and ensure that our choice of $c$ satisfies
$$T(3) = 2T(1) + 3 \le c \cdot 3 \lg 3.$$
This requires that $c \ge \dfrac{T(3)}{3 \lg 3}$.
Combining this with the above, we obtain
$$c \ge \max\!\left(1, \frac{T(3)}{3 \lg 3}\right).$$
Alternatively, we could have chosen $n_0 = 2$, in which case
$$c \ge \max\!\left(1, \frac{T(2)}{2}\right).$$
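(For concreteness, a short calculation not written out in the notes, using $T(1) = 1$ from the recurrence: $T(3) = 2T(1) + 3 = 5$, so the first choice gives $c \ge \max(1, 5/(3\lg 3)) \approx 1.05$, while $T(2) = 2T(1) + 2 = 4$ gives $c \ge \max(1, 4/2) = 2$.)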
So we have proved by PMI (strong induction) that $T(n) \le c\, n \lg n$ for all $n \ge n_0$, and hence $T(n) = O(n \lg n)$.
In order to complete the proof by substitution, we would need to also prove that T (n) = Ω(n lg n), but we
ran out of time during this class period.