Master method - Divide and Conquer

Divide-and-Conquer

Divide: split the problem into subproblems.

Conquer: solve the subproblems, often by recursion.

Combine: combine the subproblem solutions to get the solution for the entire problem.
Complexity of Divide-and-Conquer

T(n) = d + a * T(s) + c
– cost of dividing = d
– cost of combining = c
– number of subproblems = a
– size of the subproblems = s
(assume subproblems are all approximately
the same size).
MergeSort(A, left, right)
    if (left < right) {
        mid = (left + right) / 2;
        MergeSort(A, left, mid);
        MergeSort(A, mid+1, right);
        Merge(A, left, mid, right);
    }
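Merge itself is not shown above. Below is a minimal C sketch of one common implementation, using a temporary buffer (the buffer and the use of malloc are my assumptions; the slides only assume Merge runs in O(n)):

#include <stdlib.h>

/* Sketch: merge the sorted halves A[left..mid] and A[mid+1..right].
   Indices are inclusive, as in the pseudocode above. */
void Merge(int A[], int left, int mid, int right) {
    int n = right - left + 1;
    int *tmp = malloc(n * sizeof *tmp);
    int i = left, j = mid + 1, k = 0;
    while (i <= mid && j <= right)             /* take the smaller head element */
        tmp[k++] = (A[i] <= A[j]) ? A[i++] : A[j++];
    while (i <= mid)   tmp[k++] = A[i++];      /* copy any leftovers from the left half */
    while (j <= right) tmp[k++] = A[j++];      /* ... or from the right half */
    for (k = 0; k < n; k++)                    /* copy the merged run back into A */
        A[left + k] = tmp[k];
    free(tmp);
}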
Complexity of MergeSort
Cost of Dividing: O(1)
Cost of Combining: Merge, O(n)
Number of Subproblems: 2
Size of Subproblems: n/2
T(1) = 1
T(n) = 1 + 2*T(n/2) + n
T(n) = 2*T(n/2) + n    (absorbing the constant into the linear term)

Solving the Recurrence
T(1) = 1
T(n) = 2*T(n/2) + n
step 2: = 2*(2*T(n/4) + n/2) + n = 4*T(n/4) + 2n
step 3: = 4*(2*T(n/8) + n/4) + 2n = 8*T(n/8) + 3n
step k: = 2^k * T(n/2^k) + k*n
when k = log2 n: T(n) = n*T(1) + n*log2 n = n + n*log2 n, which is O(n log n)
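As a quick sanity check (not from the lecture), a small C program that evaluates the recurrence directly for powers of two and compares it to n*log2(n) + n; for powers of two the two values match exactly:

#include <stdio.h>
#include <math.h>

/* Evaluates T(n) = 2*T(n/2) + n with T(1) = 1, for n a power of two. */
long T(long n) {
    if (n <= 1) return 1;
    return 2 * T(n / 2) + n;
}

int main(void) {
    for (long n = 2; n <= (1L << 20); n <<= 1)
        printf("n = %7ld   T(n) = %9ld   n*log2(n) + n = %9.0f\n",
               n, T(n), n * log2((double)n) + n);
    return 0;
}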

Homework

Write an algorithm to count the number
of inversions in an array (see problem
2-4 Inversions, on page 41).
Binary Search(A, n, key)
    left = 1; right = n;
    while (left <= right) {
        mid = (left + right) / 2;
        if (A[mid] < key)
            left = mid + 1;
        else if (A[mid] > key)
            right = mid - 1;
        else return mid;
    }
    return not_found;
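The pseudocode above is 1-based. A runnable C sketch using the usual 0-based indexing (the function name binary_search and returning -1 for not_found are my choices, not the slides'):

/* Sketch: iterative binary search over a sorted array A[0..n-1].
   Returns the index of key, or -1 if key is not present. */
int binary_search(const int A[], int n, int key) {
    int left = 0, right = n - 1;
    while (left <= right) {
        int mid = left + (right - left) / 2;   /* avoids overflow of left + right */
        if (A[mid] < key)
            left = mid + 1;                    /* key can only be in the right half */
        else if (A[mid] > key)
            right = mid - 1;                   /* key can only be in the left half */
        else
            return mid;                        /* found */
    }
    return -1;                                 /* not_found */
}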
Binary Search Complexity
Divide = O(1)
Combine = 0
Number of Subproblems = 1
Size of Subproblem: n/2
T(n) = 1 + T(n/2)
Solve: O(log n)
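The recurrence unrolls the same way as MergeSort's:
T(n) = 1 + T(n/2) = 2 + T(n/4) = ... = k + T(n/2^k)
when k = log2 n: T(n) = log2 n + T(1), which is O(log n)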

Quicksort(A, left, right)
    if (right - left > cutoff)
        1. Choose a pivot.
        2. Move all elements < pivot to the left.
        3. Move all elements > pivot to the right.
        4. Put the pivot in the correct place.
        5. Quicksort(A, left, pivot_pos-1)
        6. Quicksort(A, pivot_pos+1, right)
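A minimal C sketch of steps 1-6 using Lomuto partitioning with the rightmost element as pivot (the pivot choice, a cutoff of 0, and the helper name partition are assumptions; the slides leave all three open):

/* Sketch: Lomuto partition; returns the final position of the pivot. */
static int partition(int A[], int left, int right) {
    int pivot = A[right];                     /* step 1: choose a pivot (rightmost here) */
    int i = left - 1;
    for (int j = left; j < right; j++)        /* steps 2-3: smaller elements go left */
        if (A[j] < pivot) {
            int t = A[++i]; A[i] = A[j]; A[j] = t;
        }
    int t = A[i + 1]; A[i + 1] = A[right]; A[right] = t;   /* step 4: place the pivot */
    return i + 1;
}

void Quicksort(int A[], int left, int right) {
    if (left < right) {                       /* cutoff of 0; the slide's cutoff would
                                                 hand small ranges to a simpler sort */
        int p = partition(A, left, right);
        Quicksort(A, left, p - 1);            /* step 5 */
        Quicksort(A, p + 1, right);           /* step 6 */
    }
}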
Master Method (Ch. 4)

T(n) = a * T(n/b) + f(n)
– majority of work is done at the top level: complexity is the top-level complexity
– majority of work is done in the leaves: complexity is the work at the leaves
– same amount of work at each level: complexity is num_levels * work at each level
Master Method (cont.)
T(n) = a * T(n/b) + f(n)
1. Let p = log_b(a)
2. *Compare n^p and f(n)
Case 1: n^p > f(n). Complexity O(n^p)
Case 2: f(n) > n^p. Complexity O(f(n))
Case 3: f(n) = n^p. Complexity O(f(n) * log n)
* a simplification: the full theorem in Ch. 4 requires n^p and f(n) to differ by a polynomial factor, plus a regularity condition when f(n) dominates.
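Applying the simplified cases to recurrences from this lecture (the n^2 example is mine, added for contrast):
MergeSort: T(n) = 2*T(n/2) + n.  a=2, b=2, p = log_2(2) = 1; f(n) = n = n^p, so Case 3: O(n log n).
Binary Search: T(n) = T(n/2) + 1.  a=1, b=2, p = log_2(1) = 0; f(n) = 1 = n^0, so Case 3: O(log n).
T(n) = 2*T(n/2) + n^2.  p = 1 and f(n) = n^2 > n^1, so Case 2: O(n^2).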
Homework:

– From before: problem 1.1, p. 13 (could write a short program that zeros in on the correct value. We'll discuss) … others
– Problem 4.1, pg. 85 (not all can use the master method); 4.1(h) is harder.
– Problem 2.4, pg. 39-40, parts a-d.
(to cover in class on Wednesday)
Maximum Non-decreasing Subsequence
Input: Array A
Output: The length of the longest non-decreasing subsequence in the array.
12,16,2,54,13,7,9,-8,14,13,17,13,15
Answer: Length 6
2,7,9,13,13,15
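One way to compute this, to compare with Wednesday's discussion: an O(n^2) dynamic-programming sketch in C (the function name and the len[] table are my own, not from the slides):

#include <stdio.h>
#include <stdlib.h>

/* Sketch: len[i] = length of the longest non-decreasing subsequence
   ending at position i; the answer is the maximum over all i.  O(n^2). */
int longest_nondecreasing(const int A[], int n) {
    int *len = malloc(n * sizeof *len);
    int best = 0;
    for (int i = 0; i < n; i++) {
        len[i] = 1;
        for (int j = 0; j < i; j++)
            if (A[j] <= A[i] && len[j] + 1 > len[i])
                len[i] = len[j] + 1;         /* extend the best run ending at j */
        if (len[i] > best) best = len[i];
    }
    free(len);
    return best;
}

int main(void) {
    int A[] = {12,16,2,54,13,7,9,-8,14,13,17,13,15};
    printf("%d\n", longest_nondecreasing(A, 13));   /* prints 6 for the example above */
    return 0;
}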