Lecture Slides

COSC 3101A - Design and Analysis of Algorithms
Lecture 3
Recurrences
Master Method
Heapsort and Priority Queues
Many of the slides are taken from Monica Nicolescu’s slides, Univ. Nevada, Reno, [email protected]
Fibonacci Numbers

Leonardo Pisano
Born: 1170 in (probably) Pisa (now in Italy)
Died: 1250 in (possibly) Pisa (now in Italy)

1, 1, 2, 3, 5, 8, 13, 21, 34, 55, ...

"A certain man put a pair of rabbits in a place surrounded on all sides by
a wall. How many pairs of rabbits can be produced from that pair in a year
if it is supposed that every month each pair begets a new pair which from
the second month on becomes productive?"

• F(n) = F(n-1) + F(n-2)
• F(1) = 0, F(2) = 1
• F(3) = 1, F(4) = 2, F(5) = 3, and so on
5/18/2004 Lecture 3
COSC3101A
2
Review: Recursive Algorithms

• A recursive algorithm solves a problem by repeatedly reducing it to a
smaller version of itself, until it reaches a form for which a solution
exists.
• Compared with iterative algorithms:
– More memory
– More computation
– Simpler, more natural way of thinking about the problem
Review: Recursive Example (1)

N! (N factorial) is defined for non-negative values as:
N! = 1                               if N = 0
N! = N * (N-1) * (N-2) * … * 1       if N > 0

For example:
5! = 5*4*3*2*1 = 120
3! = 3*2*1 = 6

The definition of N! can be restated as:
N! = N * (N-1)!    where 0! = 1
This is the same N!, but it is defined by reducing the problem to a
smaller version of the original (hence, recursively).
Review: Recursive Example (2)

int factorial(int num) {
    if (num == 0) return 1;
    else return num * factorial(num - 1);
}
Assume that factorial is called with the argument 4:
Call to factorial(4) will return 4 * value returned by factorial(3)
Call to factorial(3) will return 3 * value returned by factorial(2)
Call to factorial(2) will return 2 * value returned by factorial(1)
Call to factorial(1) will return 1 * value returned by factorial(0)
Call to factorial(0) returns 1
Review: Recursive Example (3)

The call to factorial(0) has returned, so now factorial(1) can finish:
Call to factorial(1) returns 1 * 1 => 1
The call to factorial(1) has returned, so now factorial(2) can finish:
Call to factorial(2) returns 2 * 1 => 2
The call to factorial(2) has returned, so now factorial(3) can finish:
Call to factorial(3) returns 3 * 2 => 6
The call to factorial(3) has returned, so now factorial(4) can finish:
Call to factorial(4) returns 4 * 6 => 24

Recursion results in a large number of 'activation records' (one per
method call) being placed on the system stack.
Recurrences
Def.: Recurrence = an equation or inequality that describes a
function in terms of its value on smaller inputs, and one or
more base cases
E.g.: Fibonacci numbers:
• Recurrence: F(n) = F(n-1) + F(n-2)
• Boundary conditions: F(1) = 0, F(2) = 1
• Compute: F(3) = 1, F(4) = 2, F(5) = 3, and so on
In many cases, the running time of an algorithm is
expressed as a recurrence!
Recurrences and Running Time
• Recurrences arise when an algorithm contains
recursive calls to itself
• What is the actual running time of the algorithm?
• Need to solve the recurrence
– Find an explicit formula of the expression (the generic
term of the sequence)
Typical Recurrences and Their Running Times

• T(n) = T(n-1) + n → Θ(n^2)
– Recursive algorithm that loops through the input to eliminate one item
• T(n) = T(n/2) + c → Θ(lgn)
– Recursive algorithm that halves the input in one step
• T(n) = T(n/2) + n → Θ(n)
– Recursive algorithm that halves the input but must examine every item
in the input
• T(n) = 2T(n/2) + 1 → Θ(n)
– Recursive algorithm that splits the input into 2 halves and does a
constant amount of other work
Recurrences - Intuition

• For a recurrence of the type:
T(n) = aT(n/b) + f(n)
• It takes f(n) to do the processing for the problem of size n
• The algorithm divides the problem into a subproblems, each of size n/b
• T(n) = number of subproblems * running time on size n/b + processing
of the problem of size n
Methods for Solving Recurrences

• Iteration Method
• Substitution Method
• Recursion Tree Method
• Master Method
Iteration Method

1. Expand (iterate) the recurrence
2. Express the function as a summation of terms dependent only on n and
the initial condition
Iteration Method - Example (1)

T(n) = c + T(n/2)
     = c + c + T(n/4)        since T(n/2) = c + T(n/4)
     = c + c + c + T(n/8)    since T(n/4) = c + T(n/8)

Assume n = 2^k:
T(n) = c + c + … + c + T(1)    (k times)
     = c·lgn + T(1)
     = Θ(lgn)
Iteration Method - Example (2)

T(n) = n + 2T(n/2)            Assume: n = 2^k
     = n + 2(n/2 + 2T(n/4))   since T(n/2) = n/2 + 2T(n/4)
     = n + n + 4T(n/4)
     = n + n + 4(n/4 + 2T(n/8))
     = n + n + n + 8T(n/8)
     …
     = i·n + 2^i T(n/2^i)
     = k·n + 2^k T(1)
     = n·lgn + n·T(1) = Θ(nlgn)
Substitution Method (1)

1. Guess a solution
• Experience, creativity
• Iteration method, recursion-tree method
2. Use induction to prove that the solution works
Substitution Method (2)

• Guess a solution
– T(n) = O(g(n))
– Induction goal: apply the definition of the asymptotic notation
• T(n) ≤ c g(n), for some c > 0 and n ≥ n0
– Induction hypothesis: T(k) ≤ c g(k) for all k < n
• Prove the induction goal
– Use the induction hypothesis to find some values of the constants c
and n0 for which the induction goal holds
Substitution Method - Example 1-1

T(n) = T(n-1) + n

• Guess: T(n) = O(n^2)
– Induction goal: T(n) ≤ c·n^2, for some c and n ≥ n0
– Induction hypothesis: T(k) ≤ c·k^2 for all k < n
• Proof of induction goal:
T(n) = T(n-1) + n ≤ c(n-1)^2 + n
     = cn^2 - (2cn - c - n) ≤ cn^2
if: 2cn - c - n ≥ 0 ⇒ c ≥ n/(2n-1) ⇒ c ≥ 1/(2 - 1/n)
– For n ≥ 1, 2 - 1/n ≥ 1 ⇒ any c ≥ 1 will work
Substitution Method - Example 1-2

T(n) = T(n-1) + n

• Boundary conditions:
– Base case: n0 = 1 ⇒ T(1) = 1 has to verify the condition:
T(1) ≤ c·(1)^2 ⇒ 1 ≤ c ⇒ OK!
• We can similarly prove that T(n) = Ω(n^2) <your practice>
• And therefore: T(n) = Θ(n^2)
Substitution Method - Example 2-1

T(n) = 2T(n/2) + n

• Guess: T(n) = O(nlgn)
– Induction goal: T(n) ≤ cn lgn, for some c and n ≥ n0
– Induction hypothesis: T(n/2) ≤ c(n/2) lg(n/2)
• Proof of induction goal:
T(n) = 2T(n/2) + n ≤ 2c(n/2) lg(n/2) + n
     = cn lgn - cn + n ≤ cn lgn
if: -cn + n ≤ 0 ⇒ c ≥ 1
Substitution Method - Example 2-2

T(n) = 2T(n/2) + n

• Boundary conditions:
– Base case: n0 = 1 ⇒ T(1) = 1 has to verify the condition:
T(1) ≤ c·1·lg1 = 0 ⇒ 1 ≤ 0 - contradiction
– Choose n0 = 2 ⇒ T(2) = 4 has to verify the condition:
T(2) ≤ c·2·lg2 ⇒ 4 ≤ 2c ⇒ choose c = 2
• We can similarly prove that T(n) = Ω(nlgn) <your practice>
• And therefore: T(n) = Θ(nlgn)
Changing Variables

T(n) = 2T(√n) + lgn
– Rename: m = lgn ⇒ n = 2^m
T(2^m) = 2T(2^(m/2)) + m
– Rename: S(m) = T(2^m)
S(m) = 2S(m/2) + m ⇒ S(m) = O(mlgm) (demonstrated before)
T(n) = T(2^m) = S(m) = O(mlgm) = O(lgn·lglgn)

Idea: transform the recurrence into one that you have seen before.
Recursion-Tree Method

Convert the recurrence into a tree:
1. Each node represents the cost incurred at the various levels of
recursion
2. Sum up the costs of all levels

Used to "guess" a solution for the recurrence.
Recursion-Tree Example 1

W(n) = 2W(n/2) + n^2

• Subproblem size at level i is: n/2^i
• Subproblem size hits 1 when 1 = n/2^i ⇒ i = lgn
• Cost of the problem at level i = n^2/2^i
• Number of nodes at level i = 2^i
• Total cost:

W(n) = Σ_{i=0}^{lgn-1} n^2/2^i + 2^{lgn}·W(1)
     = n^2 Σ_{i=0}^{lgn-1} (1/2)^i + n·W(1)
     ≤ n^2 Σ_{i=0}^{∞} (1/2)^i + O(n)
     = n^2 · 1/(1 - 1/2) + O(n)
     = 2n^2 + O(n)

⇒ W(n) = O(n^2)
Recursion-Tree Example 2

E.g.: T(n) = 3T(n/4) + cn^2

• Subproblem size at level i is: n/4^i
• Subproblem size hits 1 when 1 = n/4^i ⇒ i = log4n
• Cost of a node at level i = c(n/4^i)^2
• Number of nodes at level i = 3^i ⇒ last level has 3^{log4n} = n^{log4 3}
nodes
• Total cost:

T(n) = Σ_{i=0}^{log4n-1} (3/16)^i cn^2 + Θ(n^{log4 3})
     ≤ Σ_{i=0}^{∞} (3/16)^i cn^2 + Θ(n^{log4 3})
     = 1/(1 - 3/16) · cn^2 + Θ(n^{log4 3})
     = O(n^2)

⇒ T(n) = O(n^2)
Master Method

• "Cookbook" for solving recurrences of the form:

T(n) = aT(n/b) + f(n)    where a ≥ 1, b > 1, and f(n) > 0

Case 1: if f(n) = O(n^{logb a - ε}) for some ε > 0, then:
T(n) = Θ(n^{logb a})
Case 2: if f(n) = Θ(n^{logb a}), then: T(n) = Θ(n^{logb a} lgn)
Case 3: if f(n) = Ω(n^{logb a + ε}) for some ε > 0, and if
af(n/b) ≤ cf(n) for some c < 1 and all sufficiently large n
(the regularity condition), then: T(n) = Θ(f(n))
Why n^{logb a}?

• Iterate the recursive term of T(n) = aT(n/b) + f(n), ignoring f(n):
T(n) = aT(n/b) = a^2 T(n/b^2) = a^3 T(n/b^3) = … = a^i T(n/b^i)
• Assume n = b^k ⇒ k = logb n
• At the end of iteration i = k:
T(n) = a^{logb n} T(1) = Θ(a^{logb n}) = Θ(n^{logb a})

• Case 1: if f(n) is dominated by n^{logb a}:
T(n) = Θ(n^{logb a})
• Case 2: if f(n) = Θ(n^{logb a}):
T(n) = Θ(n^{logb a} lgn)
• Case 3: if f(n) dominates n^{logb a}:
T(n) = Θ(f(n))
Examples (1)

T(n) = 2T(n/2) + n

a = 2, b = 2, log2 2 = 1
Compare n^{log2 2} = n with f(n) = n
⇒ f(n) = Θ(n) ⇒ Case 2
⇒ T(n) = Θ(nlgn)
Examples (2)

T(n) = 2T(n/2) + n^2

a = 2, b = 2, log2 2 = 1
Compare n with f(n) = n^2
⇒ f(n) = Ω(n^{1+ε}) ⇒ Case 3 ⇒ verify regularity condition:
a·f(n/b) ≤ c·f(n)
⇒ 2·n^2/4 ≤ c·n^2 ⇒ c = 1/2 is a solution (c < 1)
⇒ T(n) = Θ(n^2)
Examples (3)

T(n) = 2T(n/2) + √n

a = 2, b = 2, log2 2 = 1
Compare n with f(n) = n^{1/2}
⇒ f(n) = O(n^{1-ε}) ⇒ Case 1
⇒ T(n) = Θ(n)
Examples (4)

T(n) = 3T(n/4) + nlgn

a = 3, b = 4, log4 3 ≈ 0.793
Compare n^{0.793} with f(n) = nlgn
⇒ f(n) = Ω(n^{log4 3 + ε}) ⇒ Case 3
Check regularity condition:
3(n/4)lg(n/4) ≤ (3/4)nlgn = c·f(n), for c = 3/4 < 1
⇒ T(n) = Θ(nlgn)
Examples (5)

T(n) = 2T(n/2) + nlgn

a = 2, b = 2, log2 2 = 1
• Compare n with f(n) = nlgn
– It seems like Case 3 should apply, but:
• f(n) must be polynomially larger, by a factor of n^ε
• In this case it is only larger by a factor of lgn
⇒ the master method does not apply to this recurrence
A Job Scheduling Application

• Job scheduling
– The key is the priority of the jobs in the queue
– The job with the highest priority needs to be executed next
• Operations
– Insert, remove maximum
• Data structures
– Priority queues
– Ordered array/list, unordered array/list
PQ Implementations & Cost

Worst-case asymptotic costs for a PQ with N items:

                   Insert    Remove max
ordered array        N           1
ordered list         N           1
unordered array      1           N
unordered list       1           N

Can we implement both operations efficiently?
Background on Trees

• Def: Binary tree = structure composed of a finite set of nodes that
either:
– Contains no nodes, or
– Is composed of three disjoint sets of nodes: a root node, a left
subtree and a right subtree

[Figure: a binary tree with its root, left subtree and right subtree
labeled]
Special Types of Trees

• Def: Full binary tree = a binary tree in which each node is either a
leaf or has degree exactly 2.
• Def: Complete binary tree = a binary tree in which all leaves have the
same depth and all internal nodes have degree 2.

[Figures: a full binary tree and a complete binary tree]
The Heap Data Structure

• Def: A heap is a nearly complete binary tree with the following two
properties:
– Structural property: all levels are full, except possibly the last
one, which is filled from left to right
– Order (heap) property: for any node x, Parent(x) ≥ x

[Figure: a heap with root 8; it doesn't matter that 4 in level 1 is
smaller than 5 in level 2]
Definitions

• Height of a node = the number of edges on a longest simple path from
the node down to a leaf
• Depth of a node = the length of the path from the root to the node
• Height of a tree = height of the root node
= ⌊lgn⌋, for a heap of n elements

[Figure: a heap of height 3, with node heights and depths labeled]
Array Representation of Heaps

• A heap can be stored as an array A:
– Root of tree is A[1]
– Left child of A[i] = A[2i]
– Right child of A[i] = A[2i + 1]
– Parent of A[i] = A[⌊i/2⌋]
– heap-size[A] ≤ length[A]
• The elements in the subarray A[(⌊n/2⌋+1) .. n] are leaves
• The root is the maximum element of the heap

A heap is a binary tree that is filled in order.
Heap Types

• Max-heaps (largest element at root) have the max-heap property:
– for all nodes i, excluding the root: A[PARENT(i)] ≥ A[i]
• Min-heaps (smallest element at root) have the min-heap property:
– for all nodes i, excluding the root: A[PARENT(i)] ≤ A[i]
Operations on Heaps
• Maintain the max-heap property
– MAX-HEAPIFY
• Create a max-heap from an unordered array
– BUILD-MAX-HEAP
• Sort an array in place
– HEAPSORT
• Priority queue operations
Operations on Priority Queues
• Max-priority queues support the following
operations:
– INSERT(S, x): inserts element x into set S
– EXTRACT-MAX(S): removes and returns element of
S with largest key
– MAXIMUM(S): returns element of S with largest key
– INCREASE-KEY(S, x, k): increases value of element
x’s key to k (Assume k ≥ x’s current key value)
Maintaining the Heap Property

• Suppose a node is smaller than a child, and the left and right subtrees
of i are max-heaps
• Invariant:
– the heap condition is violated only at that node
• To eliminate the violation:
– Exchange with the larger child
– Move down the tree
– Continue until the node is not smaller than its children
Maintaining the Heap Property

• Assumptions:
– Left and Right subtrees of i are max-heaps
– A[i] may be smaller than its children

Alg: MAX-HEAPIFY(A, i, n)
1. l ← LEFT(i)
2. r ← RIGHT(i)
3. if l ≤ n and A[l] > A[i]
4.    then largest ← l
5.    else largest ← i
6. if r ≤ n and A[r] > A[largest]
7.    then largest ← r
8. if largest ≠ i
9.    then exchange A[i] ↔ A[largest]
10.        MAX-HEAPIFY(A, largest, n)
Example: MAX-HEAPIFY(A, 2, 10)

A[2] violates the heap property ⇒ exchange A[2] ↔ A[4]
A[4] violates the heap property ⇒ exchange A[4] ↔ A[9]
Heap property restored
MAX-HEAPIFY Running Time

• Intuitively:
– A heap is an almost complete binary tree ⇒ must process O(lgn)
levels, with constant work at each level
• Running time of MAX-HEAPIFY is O(lgn)
• Can be written in terms of the height h of the heap, as O(h)
– Since the height of the heap is ⌊lgn⌋
Building a Heap

• Convert an array A[1 … n] into a max-heap (n = length[A])
• The elements in the subarray A[(⌊n/2⌋+1) .. n] are leaves
• Apply MAX-HEAPIFY on elements from ⌊n/2⌋ downto 1

Alg: BUILD-MAX-HEAP(A)
1. n = length[A]
2. for i ← ⌊n/2⌋ downto 1
3.    do MAX-HEAPIFY(A, i, n)

[Figure: A = [4, 1, 3, 2, 16, 9, 10, 14, 8, 7] drawn as a binary tree
with node indices 1-10]
Example:

[Figure: BUILD-MAX-HEAP trace on A = [4, 1, 3, 2, 16, 9, 10, 14, 8, 7],
showing the tree after MAX-HEAPIFY for i = 5, 4, 3, 2, 1]
Correctness of BUILD-MAX-HEAP

• Loop invariant:
– At the start of each iteration of the for loop, each node i + 1,
i + 2, …, n is the root of a max-heap
• Initialization:
– i = ⌊n/2⌋: Nodes ⌊n/2⌋+1, ⌊n/2⌋+2, …, n are leaves ⇒ they are the
roots of trivial max-heaps
Correctness of BUILD-MAX-HEAP

• Maintenance:
– MAX-HEAPIFY makes node i a max-heap root and preserves the property
that nodes i + 1, i + 2, …, n are roots of max-heaps
– Decrementing i in the for loop reestablishes the loop invariant
• Termination:
– i = 0 ⇒ each node 1, 2, …, n is the root of a max-heap (by the loop
invariant)
Running Time of BUILD-MAX-HEAP

Alg: BUILD-MAX-HEAP(A)
1. n = length[A]
2. for i ← ⌊n/2⌋ downto 1        O(n) iterations
3.    do MAX-HEAPIFY(A, i, n)    O(lgn)

⇒ Running time: O(nlgn)
• This is not an asymptotically tight upper bound
Running Time of BUILD-MAX-HEAP

• HEAPIFY takes O(h) ⇒ the cost of HEAPIFY on a node i is proportional
to the height of node i in the tree

Level     Height     No. of nodes
i = 0     h0 = h     2^0
i = 1     h1 = h-1   2^1
i = 2     h2 = h-2   2^2
…
i = h     hh = 0     2^h

hi = h - i:  height of the nodes at level i
ni = 2^i:    number of nodes at level i

T(n) = Σ_{i=0}^{h} ni·hi
Running Time of BUILD-MAX-HEAP

T(n) = Σ_{i=0}^{h} ni·hi           Cost of HEAPIFY at level i × number
                                    of nodes at that level
     = Σ_{i=0}^{h} 2^i (h - i)     Replace the values of ni and hi
                                    computed before
     = 2^h Σ_{k=0}^{h} k / 2^k     Change variables: k = h - i, and
                                    factor out 2^h
     ≤ 2^h Σ_{k=0}^{∞} k / 2^k     The sum is smaller than the sum of
                                    all elements to ∞
     ≤ 2^h · 2 ≤ 2n                The infinite sum equals 2, and
                                    2^h = 2^{⌊lgn⌋} ≤ n
     = O(n)

Running time of BUILD-MAX-HEAP: T(n) = O(n)
Heapsort
• Goal:
– Sort an array using heap representations
• Idea:
– Build a max-heap from the array
– Swap the root (the maximum element) with the last
element in the array
– “Discard” this last node by decreasing the heap size
– Call MAX-HEAPIFY on the new root
– Repeat this process until only one node remains
Alg: HEAPSORT(A)
1. BUILD-MAX-HEAP(A)              O(n)
2. for i ← length[A] downto 2     n-1 times
3.    do exchange A[1] ↔ A[i]
4.       MAX-HEAPIFY(A, 1, i - 1)    O(lgn)

• Running time: O(nlgn)
Example: A = [7, 4, 3, 1, 2]

MAX-HEAPIFY(A, 1, 4)
MAX-HEAPIFY(A, 1, 3)
MAX-HEAPIFY(A, 1, 2)
MAX-HEAPIFY(A, 1, 1)
HEAP-MAXIMUM

Goal:
– Return the largest element of the heap

Alg: HEAP-MAXIMUM(A)
1. return A[1]

Running time: O(1)
E.g., for a heap A with root 7, Heap-Maximum(A) returns 7.
HEAP-EXTRACT-MAX

Goal:
– Extract the largest element of the heap (i.e., return the max value
and also remove that element from the heap)

Idea:
– Exchange the root element with the last
– Decrease the size of the heap by 1 element
– Call MAX-HEAPIFY on the new root, on a heap of size n-1

[Figure: heap A; the root is the largest element]
HEAP-EXTRACT-MAX

Alg: HEAP-EXTRACT-MAX(A, n)
1. if n < 1
2.    then error "heap underflow"
3. max ← A[1]
4. A[1] ← A[n]
5. MAX-HEAPIFY(A, 1, n-1)    remakes heap
6. return max

Running time: O(lgn)
Example: HEAP-EXTRACT-MAX

[Figure: max = 16 is returned; the last element replaces the root; the
heap size is decreased by 1; MAX-HEAPIFY(A, 1, n-1) restores the heap,
leaving 14 at the root]
HEAP-INCREASE-KEY

• Goal:
– Increase the key of an element i in the heap
• Idea:
– Increment the key of A[i] to its new value
– If the max-heap property does not hold anymore: traverse a path
toward the root to find the proper place for the newly increased key

[Figure: a max-heap with root 16; Key[i] ← 15 is about to be applied
to node i]
HEAP-INCREASE-KEY

Alg: HEAP-INCREASE-KEY(A, i, key)
1. if key < A[i]
2.    then error "new key is smaller than current key"
3. A[i] ← key
4. while i > 1 and A[PARENT(i)] < A[i]
5.    do exchange A[i] ↔ A[PARENT(i)]
6.       i ← PARENT(i)

• Running time: O(lgn)
Example: HEAP-INCREASE-KEY

[Figure: Key[i] ← 15; the increased key floats up, exchanging with its
parent, until its parent (16) is no smaller, restoring the max-heap]
MAX-HEAP-INSERT

• Goal:
– Insert a new element into a max-heap
• Idea:
– Expand the max-heap with a new element whose key is -∞
– Call HEAP-INCREASE-KEY to set the key of the new node to its correct
value and maintain the max-heap property

[Figure: a new leaf with key -∞ is added, then its key is increased to
the inserted value]
MAX-HEAP-INSERT

Alg: MAX-HEAP-INSERT(A, key, n)
1. heap-size[A] ← n + 1
2. A[n + 1] ← -∞
3. HEAP-INCREASE-KEY(A, n + 1, key)

Running time: O(lgn)
Example: MAX-HEAP-INSERT

Insert value 15:
- Start by inserting -∞
- Increase the key to 15: call HEAP-INCREASE-KEY on A[11] = 15

[Figure: the restored heap containing the newly added element]
Summary

• We can perform the following operations on heaps:
– MAX-HEAPIFY          O(lgn)
– BUILD-MAX-HEAP       O(n)
– HEAPSORT             O(nlgn)
– MAX-HEAP-INSERT      O(lgn)
– HEAP-EXTRACT-MAX     O(lgn)
– HEAP-INCREASE-KEY    O(lgn)
– HEAP-MAXIMUM         O(1)
Readings
• Chapter 8, 6