Imam Abdulrahman Bin Faisal University
College of Computer Science and Information Technology
Department of Computer Science
CS 412 Algorithm Analysis and Design
Term 1 2022-2023
Chapter 4 – Part # 1
Divide-and-Conquer
Outline
 Introduction
 Recurrences and Running Time
 The divide-and-conquer design paradigm
 Solving Recurrences:
 Iteration method
 Substitution method
 Recursion tree method
 Master Theorem/Method
2
Introduction
 We saw how merge sort serves as an example of the divide-and-
conquer paradigm.
 The analysis of merge sort from previous lecture required us to
solve a recurrence. T(n)=2T(n/2)+cn
 In this chapter, we shall see more algorithms based on divide-and-
conquer.
 Binary search.
 Two divide-and-conquer algorithms for multiplying n*n
matrices.
 Then we will discuss different ways to solve Recurrences.
 Iteration method
 Substitution method
 Recursion tree method
 Master method
3
The divide-and-conquer design paradigm
1. Divide the problem (instance) into subproblems.
2. Conquer the subproblems by solving them recursively.
3. Combine subproblem solutions.
4
Merge sort
1. Divide: Trivial.
2. Conquer: Recursively sort 2 subarrays.
3. Combine: Linear-time merge.
5
Merge sort
T(n) = 2 T(n/2) + Θ(n)
  2 = # subproblems, n/2 = subproblem size, Θ(n) = work for dividing and combining
1. Divide: Trivial.
2. Conquer: Recursively sort 2 subarrays.
3. Combine: Linear-time merge.
6
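To make the three steps concrete, here is a minimal Python sketch of merge sort (illustrative only, not taken from the slides; the function names are our own), with the trivial divide, the two recursive calls, and the linear-time merge marked in comments. Each call does Θ(n) merging work on top of two half-size calls, which is exactly the recurrence T(n) = 2T(n/2) + Θ(n) above.

```python
def merge_sort(a):
    """Sort list a using divide-and-conquer."""
    if len(a) <= 1:                      # base case: already sorted
        return a
    mid = len(a) // 2                    # Divide: split index (trivial)
    left = merge_sort(a[:mid])           # Conquer: sort left half
    right = merge_sort(a[mid:])          # Conquer: sort right half
    return merge(left, right)            # Combine: linear-time merge

def merge(left, right):
    """Merge two sorted lists in linear time."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])                 # append whichever half remains
    out.extend(right[j:])
    return out
```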
Binary search
Example: Find 9
3 5 7 8 9 12 15
Find an element in a sorted array:
1. Divide: Check middle element.
2. Conquer: Recursively search 1 subarray.
3. Combine: Trivial.
7
(Slides 8-12 repeat the same array while the search for 9 proceeds: compare with the middle element 8, recurse on the right subarray and compare with 12, then recurse left and find 9.)
 for an ordered array A, finds if x is in the array A[lo…hi]
Alg.: BINARY-SEARCH (A, lo, hi, x)
  if (lo > hi)
    return FALSE
  mid ← ⌊(lo+hi)/2⌋
  if x = A[mid]
    return TRUE
  if ( x < A[mid] )
    return BINARY-SEARCH (A, lo, mid-1, x)
  if ( x > A[mid] )
    return BINARY-SEARCH (A, mid+1, hi, x)
(Figure: the sorted array A = {2, 3, 5, 7, 9, 10, 11, 12} at indices 1…8, with lo, mid, and hi marking the current search range.)
Binary search
13
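A direct Python transcription of the BINARY-SEARCH pseudocode might look as follows (a sketch only; it assumes 0-based indexing, unlike the 1-based arrays on the slides):

```python
def binary_search(A, lo, hi, x):
    """Return True if x occurs in the sorted list A[lo..hi] (0-based, inclusive)."""
    if lo > hi:                                 # empty range: x is not present
        return False
    mid = (lo + hi) // 2                        # Divide: check the middle element
    if A[mid] == x:
        return True
    if x < A[mid]:                              # Conquer: recurse on one subarray
        return binary_search(A, lo, mid - 1, x)
    return binary_search(A, mid + 1, hi, x)     # Combine: trivial

# The array used on the slides: {2, 3, 5, 7, 9, 10, 11, 12}
A = [2, 3, 5, 7, 9, 10, 11, 12]
print(binary_search(A, 0, len(A) - 1, 9))       # True
print(binary_search(A, 0, len(A) - 1, 6))       # False
```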
Example
 A[8] = {1, 2, 3, 4, 5, 7, 9, 11}
lo = 1 hi = 8 x = 7
mid = 4, lo = 5, hi = 8
mid = 6, A[mid] = x
Found!
(Figure: the array A = {1, 2, 3, 4, 5, 7, 9, 11} at indices 1…8, shown twice; after the first comparison the search narrows to indices 5…8.)
14
Another Example
 A[8] = {1, 2, 3, 4, 5, 7, 9, 11}
lo = 1 hi = 8 x = 6
mid = 4, lo = 5, hi = 8
mid = 6, A[6] = 7, lo = 5, hi = 5
mid = 5, A[5] = 5, lo = 6, hi = 5
lo > hi  ⇒  NOT FOUND!
(Figure: the array shown at each step, with lo and hi closing in until the search range becomes empty.)
15
Analysis of BINARY-SEARCH
Alg.: BINARY-SEARCH (A, lo, hi, x)
  if (lo > hi)                                  (constant time: c1)
    return FALSE
  mid ← ⌊(lo+hi)/2⌋                             (constant time: c2)
  if x = A[mid]
    return TRUE
  if ( x < A[mid] )                             (constant time: c3)
    return BINARY-SEARCH (A, lo, mid-1, x)      (same problem of size n/2)
  if ( x > A[mid] )
    return BINARY-SEARCH (A, mid+1, hi, x)      (same problem of size n/2)
• T(n) – running time for an array of size n
• Only one of the two recursive calls executes, and all remaining work takes constant time, so
  T(n) = c + T(n/2)
16
Recurrence for binary search
T(n) = 1 T(n/2) + Θ(1)
  1 = # subproblems, n/2 = subproblem size, Θ(1) = work for dividing and combining
17
T(n) = 1 T(n/2) + Θ(1)
Example with n = 4:
T(4) = T(2) + Θ(1)
T(4) = T(1) + Θ(1) + Θ(1)
T(4) = T(1) + 2·Θ(1)
Recurrences and Running Time
 Recurrences go hand in hand with the divide-and-conquer paradigm,
because they give us a natural way to characterize the running times
of divide-and-conquer algorithms.
 Recurrences arise when an algorithm contains recursive calls to itself
 It is an equation or inequality that describes a function in terms of its
value on smaller inputs.
 Recurrences can take many forms.
o A recursive algorithm might divide subproblems into unequal sizes, such
as a 2/3-to-1/3 split. If the divide and combine steps take linear time,
such an algorithm would give rise to the recurrence
T(n) = T(2n/3) + T(n/3) + Θ(n).
o Subproblems are not necessarily constrained to being a constant
fraction of the original problem size. For example, a recursive
version of linear search yields the recurrence T(n) = T(n-1) + Θ(1)
18
Example Recurrences
• T(n) = T(n-1) + n    →  Θ(n²)
o Recursive algorithm that loops through the input to eliminate one item
• T(n) = T(n/2) + c    →  Θ(log n)
o Recursive algorithm that halves the input in one step
• T(n) = T(n/2) + n    →  Θ(n)
o Recursive algorithm that halves the input but must examine every item in the input
• T(n) = 2T(n/2) + c   →  Θ(n)
o Recursive algorithm that splits the input into 2 halves and does a constant amount of other work
19
Recurrences and Running Time
 What is the actual running time of the algorithm?
 Need to solve the recurrence (obtaining asymptotic “Θ” or “O” bounds on the solution)
o Find an explicit formula of the expression
o Bound the recurrence by an expression that involves n
20
Methods for Solving the recurrence
21
 This chapter offers four methods for solving recurrences
1. The iteration method
 Convert the recurrence into a summation and try to bound it using known series
2. The substitution method
 Guess a bound and then use mathematical induction to prove our guess correct.
3. The recursion-tree method
 converts the recurrence into a tree whose nodes represent the costs incurred at various levels of the
recursion. We use techniques for bounding summations to solve the recurrence.
4. The master method
 provides bounds for recurrences of the form : T(n) = aT(n/b) + f(n). where a >= 1, b > 1, and f(n) is a
given function.
The Iteration method for solving
recurrences
22
The Iteration Method
 Convert the recurrence into a summation and try to bound
it using known series
o Iterate the recurrence until the initial condition is
reached.
o Use back-substitution to express the recurrence in terms
of n and the initial (boundary) condition.
23
Iteration Method – Example
T(n) = c              if n = 1
       c + T(n/2)     if n > 1

T(n) = c + T(n/2)                      [T(n/2) = c + T(n/2²),  T(n/4) = c + T(n/2³)]
     = c + c + T(n/2²)
     = c + c + c + T(n/2³)
     ........
     = c + c + … + c + T(n/2^k)        (k times)
Assume n = 2^k; the base case is reached when n/2^k = 1, i.e. n = 2^k, so k = log n.
T(n) = c·k + T(1)
     = c·lg n + T(1)                   (k = log n)
     = Θ(lg n)
24
Iteration Method – Example
T(n) = c               if n = 1
       n + 2T(n/2)     if n > 1

T(n) = n + 2T(n/2)                     [T(n/2) = n/2 + 2T(n/2²),  T(n/2²) = n/2² + 2T(n/2³)]
     = n + 2(n/2 + 2T(n/2²))
     = n + n + 2²T(n/2²)
     = n + n + 2²(n/4 + 2T(n/2³))
     = n + n + n + 2³T(n/2³)
     …………………
     = kn + 2^k T(n/2^k)
Assume n = 2^k; the base case is reached when n/2^k = 1, i.e. n = 2^k, so k = log n.
     = kn + 2^k T(1)
     = n·lg n + n·T(1) = Θ(n lg n)
25
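The two closed forms obtained by iteration can be sanity-checked numerically. The sketch below is illustrative only (it assumes T(1) = 1 and n a power of 2); it evaluates both recurrences directly and prints them next to c·lg n + T(1) and n·lg n + n·T(1):

```python
import math

def T_log(n, c=1):
    """T(n) = c + T(n/2), T(1) = 1  ->  expected c*lg n + T(1)."""
    return 1 if n == 1 else c + T_log(n // 2, c)

def T_nlogn(n):
    """T(n) = n + 2*T(n/2), T(1) = 1  ->  expected n*lg n + n*T(1)."""
    return 1 if n == 1 else n + 2 * T_nlogn(n // 2)

for k in (4, 8, 16):
    n = 2 ** k
    print(n, T_log(n), math.log2(n) + 1)         # matches c*lg n + T(1) with c = 1
    print(n, T_nlogn(n), n * math.log2(n) + n)   # matches n*lg n + n*T(1)
```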
Example
T(n) = T(n-1) + n
T(n) = T(n-2) + (n-1) + n
T(n) = T(n-3) + (n-2) + (n-1) + n
T(n) = T(n-k) + (n-k+1) + … + n
The base case is reached when n - k = 1, i.e. k = n - 1:
T(n) = T(1) + 2 + 3 + … + n = T(1) + n(n+1)/2 − 1 = Θ(n²)
26
Example
The recurrence T(n) = T(n-1) + c:
T(n) = T(n-1) + c
T(n) = T(n-2) + c + c
T(n) = T(n-3) + c + c + c
…
T(n) = T(n-k) + kc
The base case is reached when n - k = 1, i.e. k = n - 1:
T(n) = T(1) + (n-1)c
T(n) = Θ(n)
27
The recurrence T(n) = T(n-2) + c:
T(n) = T(n-2) + c
T(n) = T(n-2·2) + c + c
T(n) = T(n-2·3) + c + c + c
T(n) = T(n-2·k) + kc
The base case is reached when n - 2k = 1, i.e. k = (n-1)/2 ≈ n/2:
T(n) = T(1) + c·n/2
T(n) = Θ(n)
The substitution method for solving
recurrences
28
Substitution method
 Now that we have seen how recurrences characterize the running times of
divide-and-conquer algorithms, we will learn how to solve recurrences using
the “substitution” method.
• It is the most general method.
• It comprises two steps:
1. Guess the form of the solution.
2. Verify by mathematical induction to find the constants and show that the
solution works.
 We substitute the guessed solution for the function when applying the inductive
hypothesis to smaller values; hence the name “substitution method.”
 This method is powerful, but we must be able to guess the form of the answer
in order to apply it.
 We can use the substitution method to establish either upper (big-Oh) or lower
(big-Omega) bounds on a recurrence
29
Substitution method
1. Guess a solution
 T(n) = O(g(n))
 Induction goal: apply the definition of the asymptotic notation
 T(n) ≤ d g(n), for some d > 0 and n ≥ n0
 Induction hypothesis: T(k) ≤ d g(k) for all k < n
2. Prove the induction goal
 Use the induction hypothesis to find some values of the constants d and n0
for which the induction goal holds
(strong induction)
30
Example 1: Merge sort
T(n) = 2T(n/2) + n
 Guess: T(n) = O(nlgn) ; g(n)=nlgn
 Induction goal: T(n) ≤ c n lgn, for some c and n ≥ n0
 Induction hypothesis: T(n/2) ≤ c n/2 lg(n/2)
 Proof of induction goal:
T(n) = 2T(n/2) + n ≤ 2c(n/2) lg(n/2) + n
     = cn lg(n/2) + n
     = cn lg n − cn lg 2 + n
     = cn lg n − cn + n
     = cn lg n − n(c−1) ≤ cn lg n
if: −cn + n ≤ 0  ⇒  n(c−1) ≥ 0  ⇒  c ≥ 1, n ≥ 0
 Base case? Next page
We guess that the solution is T(n) = O(n log n)
The substitution method requires us to
prove that T(n) = O(n log n) for an
appropriate choice of the constant c > 0.
We start by assuming that this bound holds for
all positive m < n, in particular for m = (n/2)
Substituting: T(n/2) <= c.(n/2) log.(n/2)
into the recurrence
31
T(n) = 2T(n/2) + n
T(n) ≤ 2c(n/2) lg(n/2) + n = cn lg(n/2) + n = cn(lg n − lg 2) + n = cn(lg n − 1) + n = cn lg n − cn + n
T(n) ≤ cn lg n − n(c−1) ≤ cn lg n
since −n(c−1) ≤ 0 when n ≥ 0 and c ≥ 1
32
Example 1: Merge sort
T(n) = 2T(n/2) + n
 Guess: T(n) = O(nlgn)
 Induction goal: T(n) ≤ c n lgn, for some c and n ≥ n0
 Induction hypothesis: T(n/2) ≤ cn/2 lg(n/2)
 Proof of induction goal:
T(n) = 2T(n/2) + n ≤ 2c(n/2) lg(n/2) + n
T(n) ≤ cn lg(n/2) + n
     = cn lg n − cn lg 2 + n
     = cn lg n − cn + n ≤ cn lg n
if: −cn + n ≤ 0  ⇒  c ≥ 1
 Base case? n=2
• Mathematical induction requires us to show
that our solution holds for the boundary
conditions.
• We do so by showing that the boundary
conditions are suitable as base cases for the
inductive proof.
• For this recurrence, we must show that we can
choose the constant c large enough so that the
bound: T(n) <= cn log n works for the boundary
conditions as well.
• BUT: T(1) = 1 is the sole boundary condition of
the recurrence. Then for n = 1, the bound T(n)
<= cn log n yields T(1) <= c log 1 = 0
• So, we take advantage of asymptotic notation
requiring us only to prove T(n) <= cn log n for n
>= n0 where n0 is a constant that we get to
choose.
• Extend boundary conditions to make the
inductive assumption work for small n .
• T(2) = 4
• we can complete the inductive proof that
T(n) <= cn log n for some constant c >= 1
by choosing c large enough so that T(2) ≤ c·2·lg 2 = 2c
• So, any choice of c >= 2 suffices for the
base cases of n = 2
33
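The inductive bound can also be checked numerically. The sketch below is illustrative only (it assumes T(1) = 1, so T(2) = 4 as on the slide, and n a power of 2); it verifies T(n) ≤ 2·n·lg n from the base case n = 2 upward:

```python
import math

def T(n):
    """Merge-sort recurrence with T(1) = 1: T(n) = 2*T(n/2) + n."""
    return 1 if n == 1 else 2 * T(n // 2) + n

c = 2                      # constant chosen so that T(2) = 4 <= c * 2 * lg 2
for k in range(1, 11):     # n = 2, 4, ..., 1024
    n = 2 ** k
    assert T(n) <= c * n * math.log2(n), n
print("T(n) <= 2 n lg n holds for n = 2 .. 1024")
```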
Example 2: Binary Search
T(n) = T(n/2) + c
 Guess: T(n) = O(lgn)
 Induction goal: T(n) ≤ d lgn, for some d and n ≥ n0
 Induction hypothesis: T(n/2) ≤ d lg(n/2)
 Proof of induction goal:
T(n) = T(n/2) + c ≤ d lg(n/2) + c
= d lgn – d lg2 + c = d lgn – d + c ≤ d lgn
if: – d + c ≤ 0, d ≥ c
 Base case? n=2
34
Example 3
T(n) = T(n-1) + n
• Guess: T(n) = O(n²)
• Induction goal: T(n) ≤ cn², for some c and n ≥ n0
• Induction hypothesis: T(k) ≤ ck² for all k < n; in particular T(n-1) ≤ c(n-1)²
• Proof of induction goal:
T(n) = T(n-1) + n ≤ c(n-1)² + n
     = c(n² − 2n + 1) + n
     = cn² − (2cn − c − n) ≤ cn²        (see next page)
if: 2cn − c − n ≥ 0  ⇒  c ≥ n/(2n−1)  ⇒  c ≥ 1/(2 − 1/n)
• For n ≥ 1, 2 − 1/n ≥ 1, so any c ≥ 1 will work
35
Detail of Example 3
• −(2cn − c − n) ≤ 0
• 2cn − c − n ≥ 0
• c(2n−1) − n ≥ 0
• c(2n−1) ≥ n
• c ≥ n/(2n−1) = 1/(2 − 1/n)
• n ≥ 1; c ≥ 1
36
Changing variables
Idea: transform the recurrence into one that you have seen before.
T(n) = 2T(√n) + lg n
• Rename: m = lg n  ⇒  n = 2^m and √n = n^(1/2) = 2^(m/2)
  T(2^m) = 2T(2^(m/2)) + m
• Rename: S(m) = T(2^m)
  S(m) = 2S(m/2) + m  ⇒  S(m) = O(m lg m)   (demonstrated before)
  T(n) = T(2^m) = S(m) = O(m lg m) = O(lg n · lg lg n)
• Guess: S(m) = O(m lg m)
• Induction goal: S(m) ≤ c·m lg m, for some c and m ≥ m0
• Induction hypothesis: S(m/2) ≤ c(m/2) lg(m/2)
• Proof of induction goal:
  S(m) = 2S(m/2) + m ≤ 2c(m/2) lg(m/2) + m ≤ c·m lg m   (exactly as in Example 1)
37
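As a numeric check of this change of variables, the sketch below (illustrative only; it uses integer square roots and stops the recursion at n ≤ 2 with T(2) = 1) compares T(n) = 2T(√n) + lg n against lg n · lg lg n:

```python
import math

def T(n):
    """T(n) = 2*T(sqrt(n)) + lg n, with the recursion stopped at n <= 2 (T(2) = 1)."""
    if n <= 2:
        return 1
    return 2 * T(math.isqrt(n)) + math.log2(n)

for p in (4, 8, 16, 32):                             # n = 2^4, 2^8, 2^16, 2^32
    n = 2 ** p
    bound = math.log2(n) * math.log2(math.log2(n))   # lg n * lg lg n
    print(n, round(T(n), 1), round(bound, 1))        # T(n) stays within a small constant factor of the bound
```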
The recursion-tree method for solving
recurrences
38
The recursion-tree method
Convert the recurrence into a tree:
 Each node represents the cost incurred at various levels of recursion
 Sum up the costs of all levels
 The recursion tree method is good for generating guesses for the
substitution method.
39
40
Example 1
E.g.: T(n) = 8T(n/2) + n²
T(n) = 1              n = 1
       8T(n/2) + n²   n > 1
1. Iteration method:
T(n) = 8T(n/2) + n²              [T(n/2) = 8T(n/2²) + (n/2)²,   T(n/2²) = 8T(n/2³) + (n/2²)²]
     = n² + 2n² + 2²n² + … + 8^i T(n/2^i)        (level i contributes 8^i·(n/2^i)² = 2^i n²)
     = n²(2^(lg n) − 1) + 8^(lg n)·T(1)
     = n²(n^(lg 2) − 1) + n³
     = n²(n − 1) + n³
     = n³ − n² + n³   ⇒   T(n) = Θ(n³)
41
Example 1
E.g.: T(n) = 8T(n/2) + n²       2. Recursion Tree method:
The tree has 8^i nodes at depth i, each costing (n/2^i)², so level i costs 2^i n²; summing over the levels:
T(n) = n²(2^(lg n) − 1) + n³
     = n²(n^(lg 2) − 1) + n³
     = n²(n − 1) + n³
     = n³ − n² + n³   ⇒   T(n) = Θ(n³)
Example 2
E.g.: T(n) = 3T(n/4) + cn²
42
T(n) = 1              n = 1
       3T(n/4) + cn²  n > 1
T(n) = T(n/4) + T(n/4) + T(n/4) + cn² = 3T(n/4) + cn²
T(n/4) = T(n/16) + T(n/16) + T(n/16) + c(n/4)² = 3T(n/16) + c(n/4)²
Example 2
E.g.: T(n) = 3T(n/4) + cn²

T(n) = Σ_{i=0}^{log₄n − 1} (3/16)^i cn²  +  Θ(n^(log₄3))
     ≤ Σ_{i=0}^{∞} (3/16)^i cn²  +  Θ(n^(log₄3))
     = (1 / (1 − 3/16)) cn²  +  Θ(n^(log₄3))
     = O(n²)

• Subproblem size at level i is n/4^i
• Subproblem size hits 1 when n/4^i = 1, i.e. i = log₄n, so the last level is H = log₄n
• Cost at the base level (level H): the last level has 3^(log₄n) = n^(log₄3) nodes, each contributing cost T(1), so its cost is T(1)·n^(log₄3) = Θ(n^(log₄3))
• Number of nodes at level i = 3^i
• Cost of one problem (single node) at level i = c(n/4^i)² = (1/16^i)·cn²
• Cost of all nodes at level i, for i = 0, 1, 2, …, log₄n − 1:  3^i·c(n/4^i)² = (3/16)^i cn²
• Total cost:  T(n) = O(n²)
43
• This last formula looks somewhat messy.
• We can again use an infinite decreasing geometric series as an upper bound (equation (A.6)).
44
Instructor Notes:
T(n) = cn² + (3/16)cn² + (3/16)²cn² + ….
T(n) = cn²[1 + (3/16) + (3/16)² + (3/16)³ + ….]
Geometric progression with r = 3/16:  Σ_{i=0}^{∞} r^i = 1/(1 − r) for |r| < 1
T(n) = cn²·[1/(1 − 3/16)]
T(n) = cn²·(16/13) = Θ(n²)
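The level-by-level costs and the 16/13 bound can be reproduced with a few lines of Python (a sketch only; c = 1 and n = 4^8 are arbitrary choices made for illustration):

```python
n, c = 4 ** 8, 1.0
levels = 8                                   # log4(n) internal levels
level_costs = [(3 / 16) ** i * c * n * n for i in range(levels)]
total = sum(level_costs)                     # cost of all internal levels
leaves = 3 ** levels                         # n^(log4 3) leaves, cost T(1) = 1 each
print(total, (16 / 13) * c * n * n)          # total stays below (16/13) * c * n^2
print(total + leaves)                        # still Theta(n^2): the root term dominates
```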
Example 2 Explanation
• Because subproblem sizes decrease by a factor of 4 each time we go down one level, we eventually must reach a boundary condition.
• How far from the root do we reach one? The subproblem size for a node at depth i is n/4^i. Thus, the subproblem size hits 1 when n/4^i = 1 or, equivalently, when i = log₄n.
• Thus, the tree has log₄n + 1 levels (at depths 0, 1, 2, …, log₄n).
• Next, we determine the cost at each level. Each level has three times more nodes than the level above, and so the number of nodes at depth i is 3^i.
• Because subproblem sizes reduce by a factor of 4 for each level we go down from the root, each node at depth i, for i = 0, 1, 2, …, log₄n − 1, has a cost of c(n/4^i)². Multiplying, we see that the total cost over all nodes at depth i, for i = 0, 1, 2, …, log₄n − 1, is 3^i·c(n/4^i)² = (3/16)^i cn².
• The bottom level, at depth log₄n, has 3^(log₄n) = n^(log₄3) nodes, each contributing cost T(1), for a total cost of T(1)·n^(log₄3), which is Θ(n^(log₄3)) since we assume that T(1) is a constant.
• Now we add up the costs over all levels to determine the cost for the entire tree.
• Thus, we have derived a guess of T(n) = O(n²) for our original recurrence. In this example, the coefficients of cn² form a decreasing geometric series and the sum of these coefficients is bounded from above by the constant 16/13 (see Eq. A.6 on the previous slide).
• Since the root's contribution to the total cost is cn², the root contributes a constant fraction of the total cost. In other words, the cost of the root dominates the total cost of the tree.
45
Example 2 – By substitution method
T(n) = 3T(n/4) + cn²
• Guess: T(n) = O(n²)
• Induction goal: T(n) ≤ dn², for some d and n ≥ n0
• Induction hypothesis: T(n/4) ≤ d(n/4)²
• Proof of induction goal:
T(n) = 3T(n/4) + cn²
     ≤ 3d(n/4)² + cn²
     = (3/16)dn² + cn² ≤ dn²     if: d ≥ (16/13)c
• Therefore: T(n) = O(n²)
46
Instructor notes:
c ≤ d − (3/16)d
c ≤ d(1 − 3/16)
c ≤ d(13/16)
c(16/13) ≤ d
Example 3
W(n) = 2W(n/2) + n²
• Subproblem size at level i is n/2^i
• Subproblem size hits 1 when n/2^i = 1, i.e. i = lg n
• Cost of one node at level i = (n/2^i)²;  number of nodes at level i = 2^i
• Total cost:
W(n) = Σ_{i=0}^{lg n − 1} 2^i (n/2^i)²  +  2^(lg n) W(1)
     = n² Σ_{i=0}^{lg n − 1} (1/2)^i  +  Θ(n)
     ≤ n² Σ_{i=0}^{∞} (1/2)^i  +  O(n)
     = n²·(1/(1 − 1/2))  +  O(n)
     = 2n² + O(n)
• W(n) = O(n²)
47
T(n) = 2T(n/2) + n²
Initial guess: T(n) = O(n²), i.e. T(n) ≤ dn²
Inductive hypothesis: T(n/2) ≤ d(n/2)²
Proof:
T(n) = 2T(n/2) + n² ≤ 2d(n/2)² + n²
     = dn²/2 + n² = n²(d/2 + 1) ≤ dn²
provided d/2 + 1 ≤ d, i.e. d ≥ 2
48
Example 3 (using guess in the substitution
method)
Example 4
Solve T(n) = T(n/4) + T(n/2) + n²:
Recursion tree (built level by level across slides 49-56):
• Root: cost n²; its two children are T(n/4) and T(n/2).
• Level 1: (n/4)² + (n/2)² = (5/16) n²
• Level 2: (n/16)² + (n/8)² + (n/8)² + (n/4)² = (25/256) n² = (5/16)² n²
• … each level costs 5/16 of the level above, down to the Θ(1) leaves.
Total = n² (1 + 5/16 + (5/16)² + (5/16)³ + …)          (geometric series)
      ≤ n² · 1/(1 − 5/16)
      = Θ(n²)
57
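A quick numeric check of the Θ(n²) guess for this recurrence (a sketch only; it assumes integer division for the subproblem sizes and T(1) = 1):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = T(n/4) + T(n/2) + n^2 with T(1) = 1."""
    if n <= 1:
        return 1
    return T(n // 4) + T(n // 2) + n * n

for n in (2 ** 6, 2 ** 10, 2 ** 14):
    print(n, T(n) / (n * n))   # ratio approaches 1/(1 - 5/16) = 16/11 ~ 1.45, so T(n) = Theta(n^2)
```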
The master method for solving recurrences
58
Master’s method
• “Cookbook” for solving recurrences of the form:
      T(n) = aT(n/b) + f(n)
  where a ≥ 1, b > 1, and f(n) > 0
Idea: compare f(n) with n^(log_b a):
• f(n) is asymptotically smaller or larger than n^(log_b a) by a polynomial factor n^ε, or
• f(n) is asymptotically equal to n^(log_b a)
59
Master’s method
• “Cookbook” for solving recurrences of the form:
      T(n) = aT(n/b) + f(n)
  where a ≥ 1, b > 1, and f(n) > 0
Case 1: if f(n) = O(n^(log_b a − ε)) for some ε > 0, then: T(n) = Θ(n^(log_b a))
Case 2: if f(n) = Θ(n^(log_b a)), then: T(n) = Θ(n^(log_b a) lg n)
Case 3: if f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and if
        af(n/b) ≤ cf(n) for some c < 1 and all sufficiently large n (the regularity condition), then:
        T(n) = Θ(f(n))
60
Why n^(log_b a)?
T(n) = aT(n/b) + f(n)
• Iterating the recursive part: T(n) → a·T(n/b) → a²·T(n/b²) → a³·T(n/b³) → …, i.e. T(n) involves a^i·T(n/b^i) for every i.
• Assume n = b^k  ⇒  k = log_b n
• At the end of iteration i = k:
  a^(log_b n)·T(1) = n^(log_b a)·T(1) = Θ(n^(log_b a))
• Case 1: If f(n) is dominated by n^(log_b a):  T(n) = Θ(n^(log_b a))
• Case 2: If f(n) = Θ(n^(log_b a)):  T(n) = Θ(n^(log_b a) log n)
• Case 3: If f(n) dominates n^(log_b a):  T(n) = Θ(f(n))
61
The Master Method
• provides a “cookbook” method for solving recurrences of the form
      T(n) = aT(n/b) + f(n)
• This recurrence describes the running time of an algorithm that divides a problem of size n into a subproblems, each of size n/b, where a and b are positive constants.
• a is a constant that shows into how many subproblems the original is divided.
• b is a constant that determines the size n/b of each subproblem.
• The a subproblems are solved recursively, each in time T(n/b).
• f(n) is an asymptotically positive function that encompasses the cost of dividing the problem and combining the results of the subproblems.
• Note that n/b may not be an integer, but the analysis does not change for either ⌊n/b⌋ or ⌈n/b⌉.
62
The Master Theorem
• Let a >= 1 and b > 1 be constants, let f(n) be a function, and let T(n)
be defined on the nonnegative integers by the recurrence:
T(n) = aT(n/b) + f(n)
• Then T(n) has the following asymptotic bounds:
• Three cases:
1. f(n) = O(n^(log_b a − ε)) for some constant ε > 0,
   • f(n) grows polynomially slower than n^(log_b a) (by an n^ε factor).
   • Solution: T(n) = Θ(n^(log_b a)).
2. f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
   • the two functions f(n) and n^(log_b a) are the same size (grow at similar rates), so we multiply by a logarithmic factor
   • Solution: T(n) = Θ(n^(log_b a) lg n) = Θ(f(n) lg n).
3. f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if af(n/b) ≤ cf(n) for some constant c < 1 and all sufficiently large n.
   • f(n) grows polynomially faster than n^(log_b a) (by an n^ε factor).
   • Solution: T(n) = Θ(f(n)). 63
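For the common special case f(n) = n^k, the three cases can be mechanized. The sketch below is illustrative only; it handles pure powers of n, so it cannot detect the non-polynomial gaps where the theorem does not apply (as in Example 7 below), and the regularity condition of case 3 holds automatically for such f(n):

```python
import math

def master(a, b, k, eps=1e-9):
    """Solve T(n) = a*T(n/b) + Theta(n^k) for a >= 1, b > 1, k >= 0.

    For f(n) = n^k the case-3 regularity condition holds automatically,
    since a*(n/b)^k = (a/b^k)*n^k <= c*n^k with c = a/b^k < 1.
    """
    e = math.log(a, b)                      # the critical exponent log_b a
    if k < e - eps:
        return f"Theta(n^{e:.3f})"          # case 1: f grows polynomially slower
    if abs(k - e) <= eps:
        return f"Theta(n^{e:.3f} * lg n)"   # case 2: same growth, add a log factor
    return f"Theta(n^{k})"                  # case 3: f dominates

print(master(2, 2, 1))   # T(n) = 2T(n/2) + n    -> Theta(n^1.000 * lg n)
print(master(9, 3, 1))   # T(n) = 9T(n/3) + n    -> Theta(n^2.000)
print(master(4, 2, 3))   # T(n) = 4T(n/2) + n^3  -> Theta(n^3)
```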
Example1:
Use the master method to solve the recurrence: T(n) = 2T(n/2) + n
Solution: a = 2, b = 2, log₂2 = 1
Compare n^(log₂2) = n with f(n) = n
Case 2: f(n) = Θ(n)
⇒ T(n) = Θ(n lg n)
64
Example2:
Use the master method to solve the recurrence: T(n) = 2T(n/2) + n2
Solution: a = 2, b = 2, log₂2 = 1
Compare n^(log₂2) = n with f(n) = n²
Case 3: f(n) = Ω(n^(log_b a + ε))
  f(n) = Ω(n^(1+ε)):  n² = n^(1+ε)  ⇒  2 = 1 + ε  ⇒  ε = 2 − 1 = 1 > 0
Verify regularity condition: a·f(n/b) ≤ c·f(n)
  ⇒ 2·n²/4 ≤ cn²  ⇒  c = ½ is a solution (c < 1)
⇒ T(n) = Θ(n²)
65
Example3:
Use the master method to solve the recurrence: T(n) = 2T(n/2) + √n
Solution: a = 2, b = 2, log₂2 = 1
Compare n^(log₂2) = n with f(n) = n^(1/2)
Case 1: f(n) = O(n^(log_b a − ε))
  1 − ε = 1/2  ⇒  ε = 1 − 0.5 = 0.5 > 0
  f(n) = O(n^(1−ε)) with ε = 0.5
⇒ T(n) = Θ(n)
66
Use the master method to solve the recurrence T(n) = 9T(n/3) + n
Solution: We have a = 9, b = 3
And n^(log₃9) = n²; f(n) = n
Case 1: f(n) = O(n^(2−ε)) for ε = 1 > 0
⇒ T(n) = Θ(n²).
Example4:
67
Use the master method to solve the recurrence: T(n) = T(2n/3) + 1
T(n) = T(2n/3) + 1 = T(n/(3/2)) + 1
Solution: We have a = 1, b = 3/2
And n^(log_{3/2}1) = n^0 = 1, since (3/2)^0 = 1; f(n) = 1
Case 2: f(n) = Θ(1) = Θ(n^(log_b a))
⇒ T(n) = Θ(log n).
Example5:
68
Use the master method to solve the recurrence: T(n) = 3T(n/4) + n lg n
Solution: We have a = 3, b = 4
And n^(log₄3) ≈ n^0.793; f(n) = n lg n
f(n) = Ω(n^(0.793+ε)):  0.793 + ε = 1  ⇒  ε = 1 − 0.793 ≈ 0.207 > 0
Case 3:
Verify regularity condition: af(n/b) ≤ cf(n)
  af(n/b) = 3f(n/4) = 3·(n/4)·lg(n/4) ≤ (3/4)·n lg n = c·f(n)   where c = 3/4 < 1
⇒ T(n) = Θ(n lg n)
Example6:
69
70
af(n/b) = 3f(n/4) = 3·(n/4)·lg(n/4)
        = (3n/4)(lg n − lg 4)
        = (3n/4)(lg n − 2)
        = (3/4)·n lg n − (3/2)·n  <  (3/4)·n lg n = c·n lg n
c = 3/4 = 0.75 < 1
Use the master method to solve the recurrence: T(n) = 2T(n/2) + n lg n
Solution: We have: a = 2, b = 2, log₂2 = 1
Compare n^(log₂2) = n with f(n) = n lg n
It seems like case 3 should apply, BUT
• f(n) must be polynomially larger by a factor of n^ε
• In this case it is only larger by a factor of lg n
• The master method does not apply.
• In particular, for every constant ε > 0, we have n^ε = ω(lg n).
Example7:
71
Use the master method to solve the recurrence: T(n) = 4T(n/2) + n
Solution: We have a = 4, b = 2
And n^(log_b a) = n²; f(n) = n.
Case 1: f(n) = O(n^(2−ε)) for ε = 1 > 0.
⇒ T(n) = Θ(n²)
Use the master method to solve the recurrence: T(n) = 4T(n/2) + n²
Solution: We have a = 4, b = 2
And n^(log_b a) = n²; f(n) = n².
Case 2: f(n) = Θ(n²)
⇒ T(n) = Θ(n² log n)
Example8 - 9:
72
Use the master method to solve the recurrence: T(n) = 4T(n/2) + n³
Solution: We have a = 4, b = 2
And n^(log_b a) = n²; f(n) = n³.
Case 3: f(n) = Ω(n^(2+ε)) for ε = 1.
Regularity: af(n/b) ≤ cf(n)  ⇒  4(n/2)³ ≤ cn³  ⇒  4n³/8 ≤ cn³  (reg. cond.) for c = ½ < 1.
⇒ T(n) = Θ(n³)
Use the master method to solve the recurrence: T(n) = 4T(n/2) + n²/log n
Solution: We have a = 4, b = 2
And n^(log_b a) = n²; f(n) = n²/log n
• It seems like case 1 should apply, but f(n) must be polynomially smaller by a factor of n^ε
• In this case it is only smaller by a factor of lg n
• The master method does not apply.
Example10 - 11:
73
References
74
• T.H. Cormen, C.E. Leiserson, R.L. Rivest, C. Stein, “Introduction to Algorithms”, 3rd Edition, The MIT Press, 2009, ISBN-10: 0262033844, ISBN-13: 978-0262033848, Chapter 4.
• https://www.cse.unr.edu/~bebis/CS477/
• https://www.youtube.com/watch?v=8gt0D0IqU5w
• https://www.youtube.com/watch?v=whjt_N9uYFI
• https://www.youtube.com/watch?v=PSuNNYw1BNA&list=PL6EF0274BD849A7D5&index=5&t=13s
Editor's Notes
1. For example, let us see how a recursion tree would provide a good guess for the recurrence T(n) = 3T(n/4) + Θ(n²). We start by focusing on finding an upper bound for the solution. Because we know that floors and ceilings usually do not matter when solving recurrences (here's an example of sloppiness that we can tolerate), we create a recursion tree for the recurrence T(n) = 3T(n/4) + cn², having written out the implied constant coefficient c > 0. Figure 4.5 shows how we derive the recursion tree for T(n) = 3T(n/4) + cn². For convenience, we assume that n is an exact power of 4 (another example of tolerable sloppiness) so that all subproblem sizes are integers. Part (a) of the figure shows T(n), which we expand in part (b) into an equivalent tree representing the recurrence. The cn² term at the root represents the cost at the top level of recursion, and the three subtrees of the root represent the costs incurred by the subproblems of size n/4. Part (c) shows this process carried one step further by expanding each node with cost T(n/4) from part (b). The cost for each of the three children of the root is c(n/4)². We continue expanding each node in the tree by breaking it into its constituent parts as determined by the recurrence.