1. Imam Abdulrahman Bin Faisal University
College of Computer Science and Information Technology
Department of Computer Science
CS 412 Algorithm Analysis and Design
Term 1 2022-2023
Chapter 4 – Part # 1
Divide-and-Conquer
2. Outline
Introduction
Recurrences and Running Time
The divide-and-conquer design paradigm
Solving Recurrences:
Iteration method
Substitution method
Recursion tree method
Master Theorem/Method
2
3. Introduction
We saw how merge sort serves as an example of the divide-and-
conquer paradigm.
The analysis of merge sort from previous lecture required us to
solve a recurrence. T(n)=2T(n/2)+cn
In this chapter, we shall see more algorithms based on divide-and-
conquer.
Binary search.
Two divide-and-conquer algorithms for multiplying n × n matrices.
Then we will discuss different ways to solve Recurrences.
Iteration method
Substitution method
Recursion tree method
Master method
4. The divide-and-conquer design paradigm
1. Divide the problem (instance) into subproblems.
2. Conquer the subproblems by solving them recursively.
3. Combine subproblem solutions.
7. Binary search
Example: Find 9
3 5 7 8 9 12 15
Find an element in a sorted array:
1. Divide: Check middle element.
2. Conquer: Recursively search 1 subarray.
3. Combine: Trivial.
13. Binary search
For an ordered array A, finds whether x is in the array A[lo…hi]:
Alg.: BINARY-SEARCH (A, lo, hi, x)
  if (lo > hi)
    return FALSE
  mid ← ⌊(lo + hi)/2⌋
  if x = A[mid]
    return TRUE
  if x < A[mid]
    return BINARY-SEARCH (A, lo, mid − 1, x)
  else
    return BINARY-SEARCH (A, mid + 1, hi, x)
[Figure: array A = {2, 3, 5, 7, 9, 10, 11, 12} at indices 1–8, with lo, mid, and hi marked.]
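The pseudocode above can be sketched as a minimal runnable implementation (0-indexed, as is idiomatic in Python; the slides use 1-indexed arrays):

```python
def binary_search(A, lo, hi, x):
    """Return True if x occurs in the sorted slice A[lo..hi] (inclusive)."""
    if lo > hi:                       # empty range: x is not present
        return False
    mid = (lo + hi) // 2              # divide: pick the middle element
    if x == A[mid]:
        return True
    if x < A[mid]:                    # conquer: recurse into one subarray
        return binary_search(A, lo, mid - 1, x)
    return binary_search(A, mid + 1, hi, x)

A = [3, 5, 7, 8, 9, 12, 15]
print(binary_search(A, 0, len(A) - 1, 9))   # True
print(binary_search(A, 0, len(A) - 1, 6))   # False
```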
14. Example
A[8] = {1, 2, 3, 4, 5, 7, 9, 11} (indices 1–8)
Search for x = 7:
  lo = 1, hi = 8 → mid = 4, A[4] = 4 < x → lo = 5, hi = 8
  mid = 6, A[6] = 7 = x → Found!
15. Another Example
A[8] = {1, 2, 3, 4, 5, 7, 9, 11} (indices 1–8)
Search for x = 6:
  lo = 1, hi = 8 → mid = 4, A[4] = 4 < x → lo = 5, hi = 8
  mid = 6, A[6] = 7 > x → lo = 5, hi = 5
  mid = 5, A[5] = 5 < x → lo = 6, hi = 5
  lo > hi → NOT FOUND!
16. Analysis of BINARY-SEARCH
Alg.: BINARY-SEARCH (A, lo, hi, x)
  if (lo > hi)                                ← constant time: c1
    return FALSE
  mid ← ⌊(lo + hi)/2⌋                         ← constant time: c2
  if x = A[mid]
    return TRUE
  if x < A[mid]                               ← constant time: c3
    return BINARY-SEARCH (A, lo, mid − 1, x)  ← same problem of size n/2
  else
    return BINARY-SEARCH (A, mid + 1, hi, x)  ← same problem of size n/2
T(n), the running time for an array of size n, therefore satisfies
  T(n) = c + T(n/2)
where c = c1 + c2 + c3 is a constant.
18. Recurrences and Running Time
Recurrences go hand in hand with the divide-and-conquer paradigm,
because they give us a natural way to characterize the running times
of divide-and-conquer algorithms.
Recurrences arise when an algorithm contains recursive calls to itself.
A recurrence is an equation or inequality that describes a function in terms of its
value on smaller inputs.
Recurrences can take many forms.
o A recursive algorithm might divide subproblems into unequal sizes, such
as a 2/3-to-1/3 split. If the divide and combine steps take linear time,
such an algorithm would give rise to the recurrence
T(n) = T(2n/3) + T(n/3) + Θ(n).
o Subproblems are not necessarily constrained to being a constant
fraction of the original problem size. For example, a recursive
version of linear search yields the recurrence T(n) = T(n−1) + Θ(1).
19. Example Recurrences
T(n) = T(n−1) + n → Θ(n²)
o Recursive algorithm that loops through the input to eliminate one item
T(n) = T(n/2) + c → Θ(lg n)
o Recursive algorithm that halves the input in one step
T(n) = T(n/2) + n → Θ(n)
o Recursive algorithm that halves the input but must examine every item in the input
T(n) = 2T(n/2) + c → Θ(n)
o Recursive algorithm that splits the input into 2 halves and does a constant amount of other work
20. Recurrences and Running Time
What is the actual running time of the algorithm?
Need to solve the recurrence (obtaining asymptotic “Θ” or “O” bounds on the solution)
o Find an explicit formula of the expression
o Bound the recurrence by an expression that involves n
21. Methods for Solving the Recurrence
This chapter offers four methods for solving recurrences:
1. The iteration method
Convert the recurrence into a summation and try to bound it using known series.
2. The substitution method
Guess a bound and then use mathematical induction to prove our guess correct.
3. The recursion-tree method
Converts the recurrence into a tree whose nodes represent the costs incurred at various levels of the
recursion. We use techniques for bounding summations to solve the recurrence.
4. The master method
Provides bounds for recurrences of the form T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f(n) is a
given function.
23. The Iteration Method
Convert the recurrence into a summation and try to bound
it using known series
o Iterate the recurrence until the initial condition is
reached.
o Use back-substitution to express the recurrence in terms
of n and the initial (boundary) condition.
24. Iteration Method – Example
T(n) = c if n = 1, and T(n) = c + T(n/2) if n > 1.
T(n) = c + T(n/2)                  [T(n/2) = c + T(n/2²)]
     = c + c + T(n/2²)             [T(n/2²) = c + T(n/2³)]
     = c + c + c + T(n/2³)
     ........
     = c + c + … + c + T(n/2^k)    [k times]
Assume n = 2^k; the base case is reached when n/2^k = 1, i.e., n = 2^k and k = lg n.
T(n) = c·k + T(1)
     = c·lg n + T(1)               [k = lg n]
     = Θ(lg n)
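The closed form derived by iteration can be checked directly by evaluating the recurrence (a sketch; the constants c = 3 and T(1) = 1 are arbitrary choices):

```python
def T(n, c=3, t1=1):
    """Directly evaluate the recurrence T(1) = t1, T(n) = c + T(n/2)."""
    return t1 if n == 1 else c + T(n // 2, c, t1)

# Iteration gives T(n) = c*lg n + T(1) exactly for n = 2^k.
for k in range(11):
    assert T(2 ** k) == 3 * k + 1
```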
25. Iteration Method – Example
T(n) = c if n = 1, and T(n) = n + 2T(n/2) if n > 1.
T(n) = n + 2T(n/2)                 [T(n/2) = n/2 + 2T(n/2²)]
     = n + 2(n/2 + 2T(n/2²))
     = n + n + 2²T(n/2²)           [T(n/2²) = n/2² + 2T(n/2³)]
     = n + n + 2²(n/2² + 2T(n/2³))
     = n + n + n + 2³T(n/2³)
     …………………
     = kn + 2^k T(n/2^k)
Assume n = 2^k; the base case is reached when n/2^k = 1, i.e., n = 2^k and k = lg n.
T(n) = kn + 2^k T(1) = n lg n + n·T(1) = Θ(n lg n)
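This closed form, too, can be confirmed by direct evaluation (a sketch assuming T(1) = 1):

```python
def T(n, c=1):
    """Directly evaluate the recurrence T(1) = c, T(n) = n + 2T(n/2)."""
    return c if n == 1 else n + 2 * T(n // 2, c)

# Iteration gives T(n) = n*lg n + n*T(1) exactly for n = 2^k.
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * k + n
```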
29. Substitution method
Now that we have seen how recurrences characterize the running times of
divide-and-conquer algorithms, we will learn how to solve recurrences using
the “substitution” method.
It is the most general method.
It comprises two steps:
1. Guess the form of the solution.
2. Verify by mathematical induction to find the constants and show that the
solution works.
We substitute the guessed solution for the function when applying the inductive
hypothesis to smaller values; hence the name “substitution method.”
This method is powerful, but we must be able to guess the form of the answer
in order to apply it.
We can use the substitution method to establish either upper (big-Oh) or lower
(big-Omega) bounds on a recurrence.
30. Substitution method
1. Guess a solution
T(n) = O(g(n))
Induction goal: apply the definition of the asymptotic notation
T(n) ≤ d g(n), for some d > 0 and n ≥ n0
Induction hypothesis: T(k) ≤ d g(k) for all k < n
2. Prove the induction goal
Use the induction hypothesis to find some values of the constants d and n0
for which the induction goal holds
(strong induction)
31. Example 1: Merge sort
T(n) = 2T(n/2) + n
Guess: T(n) = O(n lg n); g(n) = n lg n
Induction goal: T(n) ≤ c·n lg n, for some c and n ≥ n₀
Induction hypothesis: T(n/2) ≤ c·(n/2) lg(n/2)
Proof of induction goal:
T(n) = 2T(n/2) + n ≤ 2c·(n/2) lg(n/2) + n
     = cn lg(n/2) + n
     = cn lg n − cn lg 2 + n
     = cn lg n − cn + n
     = cn lg n − n(c − 1) ≤ cn lg n
which holds if −cn + n ≤ 0, i.e., n(c − 1) ≥ 0, i.e., c ≥ 1 and n ≥ 0.
Base case? Next page.
We guess that the solution is T(n) = O(n lg n). The substitution method requires us to
prove that T(n) ≤ cn lg n for an appropriate choice of the constant c > 0. We start by
assuming that this bound holds for all positive m < n, in particular for m = n/2, and
substitute T(n/2) ≤ c·(n/2) lg(n/2) into the recurrence.
32. T(n) = 2T(n/2) + n
T(n) ≤ 2c·(n/2) lg(n/2) + n = cn lg(n/2) + n = cn(lg n − lg 2) + n = cn(lg n − 1) + n = cn lg n − cn + n
T(n) ≤ cn lg n − n(c − 1) ≤ cn lg n
since −n(c − 1) ≤ 0, i.e., n(c − 1) ≥ 0, which holds for n ≥ 0 and c ≥ 1.
33. Example 1: Merge sort
T(n) = 2T(n/2) + n
Guess: T(n) = O(n lg n)
Induction goal: T(n) ≤ cn lg n, for some c and n ≥ n₀
Induction hypothesis: T(n/2) ≤ c·(n/2) lg(n/2)
Proof of induction goal:
T(n) = 2T(n/2) + n ≤ 2c·(n/2) lg(n/2) + n
     = cn lg(n/2) + n = cn lg n − cn lg 2 + n
     = cn lg n − cn + n ≤ cn lg n
if −cn + n ≤ 0, i.e., c ≥ 1.
Base case? n = 2.
• Mathematical induction requires us to show that our solution holds for the boundary
conditions. We do so by showing that the boundary conditions are suitable as base
cases for the inductive proof.
• For this recurrence, we must show that we can choose the constant c large enough so
that the bound T(n) ≤ cn lg n works for the boundary conditions as well.
• BUT: T(1) = 1 is the sole boundary condition of the recurrence, and for n = 1 the
bound T(n) ≤ cn lg n yields T(1) ≤ c·1·lg 1 = 0, which conflicts with T(1) = 1.
• So, we take advantage of asymptotic notation requiring us only to prove
T(n) ≤ cn lg n for n ≥ n₀, where n₀ is a constant that we get to choose.
• Extend the boundary conditions to make the inductive assumption work for small n:
T(2) = 2T(1) + 2 = 4.
• We can then complete the inductive proof that T(n) ≤ cn lg n for some constant c ≥ 1
by choosing c large enough so that T(2) ≤ c·2 lg 2 = 2c.
• So, any choice of c ≥ 2 suffices for the base case n = 2.
34. Example 2: Binary Search
T(n) = T(n/2) + c
Guess: T(n) = O(lgn)
Induction goal: T(n) ≤ d lgn, for some d and n ≥ n0
Induction hypothesis: T(n/2) ≤ d lg(n/2)
Proof of induction goal:
T(n) = T(n/2) + c ≤ d lg(n/2) + c
     = d lg n − d lg 2 + c = d lg n − d + c ≤ d lg n
which holds if −d + c ≤ 0, i.e., d ≥ c.
Base case? n=2
35. Example 3
T(n) = T(n−1) + n
Guess: T(n) = O(n²)
Induction goal: T(n) ≤ cn², for some c and n ≥ n₀
Induction hypothesis: T(k) ≤ ck² for all k < n; in particular, T(n−1) ≤ c(n−1)²
Proof of induction goal:
T(n) = T(n−1) + n ≤ c(n−1)² + n
     = c(n² − 2n + 1) + n
     = cn² − (2cn − c − n) ≤ cn² (see next page)
if 2cn − c − n ≥ 0, i.e., c ≥ n/(2n − 1), i.e., c ≥ 1/(2 − 1/n).
For n ≥ 1, 2 − 1/n ≥ 1, so any c ≥ 1 will work.
36. Detail of Example 3
−(2cn − c − n) ≤ 0
2cn − c − n ≥ 0
c(2n − 1) − n ≥ 0
c(2n − 1) ≥ n
c ≥ n/(2n − 1) = 1/(2 − 1/n)
For n ≥ 1 this holds for any c ≥ 1.
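The guess T(n) = O(n²) with c = 1 can be sanity-checked numerically (a sketch assuming base case T(1) = 1):

```python
def T(n):
    """Evaluate T(n) = T(n-1) + n iteratively to avoid recursion-depth limits."""
    t = 1                        # assumed base case T(1) = 1
    for i in range(2, n + 1):
        t += i
    return t

for n in range(1, 201):
    assert T(n) == n * (n + 1) // 2   # exact closed form (arithmetic series)
    assert T(n) <= n * n              # the bound proved above with c = 1
```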
37. Changing variables
Idea: transform the recurrence into one that you have seen before.
T(n) = 2T(√n) + lg n
Rename: m = lg n, so n = 2^m and √n = n^(1/2) = 2^(m/2).
T(2^m) = 2T(2^(m/2)) + m
Rename: S(m) = T(2^m).
S(m) = 2S(m/2) + m ⟹ S(m) = O(m lg m) (demonstrated before)
T(n) = T(2^m) = S(m) = O(m lg m) = O(lg n · lg lg n)
To verify by substitution: guess S(m) = O(m lg m); induction goal S(m) ≤ c·m lg m;
induction hypothesis S(m/2) ≤ c·(m/2) lg(m/2); then
S(m) = 2S(m/2) + m ≤ cm lg(m/2) + m, and the proof proceeds exactly as for merge sort.
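The change of variables can be checked numerically for n of the form 2^(2^j), where √n stays in the same family (a sketch; the base case T(2) = 1 is an assumption):

```python
import math

def T(n):
    """Evaluate T(n) = 2T(sqrt(n)) + lg n with assumed base case T(2) = 1,
    for n of the form 2^(2^j) so that sqrt(n) is exact."""
    if n == 2:
        return 1
    lg = round(math.log2(n))
    return 2 * T(2 ** (lg // 2)) + lg

# With m = lg n: S(m) = 2S(m/2) + m solves to S(m) = m lg m + m (S(1) = 1),
# so T(n) = lg n * lg lg n + lg n exactly for these n.
for j in range(1, 6):
    n = 2 ** (2 ** j)
    m = 2 ** j          # m = lg n, and j = lg m = lg lg n
    assert T(n) == m * j + m
```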
39. The recursion-tree method
Convert the recurrence into a tree:
Each node represents a cost incurred at some level of the recursion
Sum up the costs of all levels
The recursion tree method is good for generating guesses for the
substitution method.
43. Example 2
E.g.: T(n) = 3T(n/4) + cn²
T(n) = Σ_{i=0}^{log₄n − 1} (3/16)^i cn² + Θ(n^(log₄3))
     ≤ Σ_{i=0}^{∞} (3/16)^i cn² + Θ(n^(log₄3))
     = (1/(1 − 3/16))·cn² + Θ(n^(log₄3)) = O(n²)
• Subproblem size at level i is n/4^i.
• Subproblem size hits 1 when n/4^i = 1, i.e., at the last level H = log₄n.
• Cost at the base level H: the last level has 3^(log₄n) = n^(log₄3) nodes, each
contributing cost T(1), so its cost is T(1)·n^(log₄3) = Θ(n^(log₄3)).
• Number of nodes at level i is 3^i.
• Cost of one problem (single node) at level i is c(n/4^i)² = (1/16^i)·cn².
• Cost of the problem (all nodes) at level i, for i = 0, 1, 2, …, log₄n − 1, is
3^i·c(n/4^i)² = (3/16)^i cn².
• Total cost: T(n) = O(n²)
44. • This last formula looks somewhat messy.
• We can again use an infinite decreasing geometric series as an upper bound,
equation (A.6).
Instructor Notes:
T(n) = cn² + (3/16)cn² + (3/16)²cn² + …
T(n) = cn²·[1 + (3/16) + (3/16)² + (3/16)³ + …]
Geometric progression with r = 3/16:
Σ_{i=0}^{∞} r^i = 1/(1 − r) for |r| < 1
T(n) = cn²·[1/(1 − 3/16)] = cn²·(16/13) = Θ(n²)
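The geometric-series bound can be confirmed numerically (a sketch; c = 1 and n = 4^10 are arbitrary choices):

```python
# Numeric check of the geometric-series bound for T(n) = 3T(n/4) + cn^2.
c, n = 1.0, 4 ** 10          # c and n are arbitrary; n is a power of 4
levels = 10                  # log_4(n) levels carry a cn^2-type cost
finite = sum((3 / 16) ** i * c * n ** 2 for i in range(levels))
infinite = (16 / 13) * c * n ** 2   # limit of the infinite geometric series
assert finite < infinite            # the finite sum stays below the limit
assert abs(infinite / (c * n ** 2) - 16 / 13) < 1e-9
```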
45. Example 2 Explanation
Because subproblem sizes decrease by a factor of 4 each time we go down one level,
we eventually must reach a boundary condition. How far from the root do we reach
one? The subproblem size for a node at depth i is n/4^i. Thus, the subproblem size
hits 1 when n/4^i = 1 or, equivalently, when i = log₄n.
Thus, the tree has log₄n + 1 levels (at depths 0, 1, 2, …, log₄n).
Next, we determine the cost at each level. Each level has three times more nodes
than the level above, and so the number of nodes at depth i is 3^i.
Because subproblem sizes reduce by a factor of 4 for each level we go down from the
root, each node at depth i, for i = 0, 1, 2, …, log₄n − 1, has a cost of c(n/4^i)².
Multiplying, we see that the total cost over all nodes at depth i, for
i = 0, 1, 2, …, log₄n − 1, is 3^i·c(n/4^i)² = (3/16)^i cn².
The bottom level, at depth log₄n, has 3^(log₄n) = n^(log₄3) nodes, each contributing
cost T(1), for a total cost of T(1)·n^(log₄3), which is Θ(n^(log₄3)) since we assume
that T(1) is a constant.
Now we add up the costs over all levels to determine the cost for the entire tree.
Thus, we have derived a guess of T(n) = O(n²) for our original recurrence. In this
example, the coefficients of cn² form a decreasing geometric series and the sum of
these coefficients is bounded from above by the constant 16/13 (see Eq. A.6 on the
previous slide).
Since the root’s contribution to the total cost is cn², the root contributes a
constant fraction of the total cost. In other words, the cost of the root dominates
the total cost of the tree.
46. Example 2 – By substitution method
T(n) = 3T(n/4) + cn²
Guess: T(n) = O(n²)
Induction goal: T(n) ≤ dn², for some d and n ≥ n₀
Induction hypothesis: T(n/4) ≤ d(n/4)²
Proof of induction goal:
T(n) = 3T(n/4) + cn²
     ≤ 3d(n/4)² + cn²
     = (3/16)dn² + cn² ≤ dn² if d ≥ (16/13)c
Therefore: T(n) = O(n²)
Instructor notes:
(3/16)d + c ≤ d
c ≤ d − (3/16)d
c ≤ d(13/16)
d ≥ (16/13)c
47. Example 3
W(n) = 2W(n/2) + n²
• Subproblem size at level i is n/2^i.
• Subproblem size hits 1 when n/2^i = 1, i.e., i = lg n.
• Cost of one problem at level i is (n/2^i)²; the number of nodes at level i is 2^i.
• Total cost:
W(n) = Σ_{i=0}^{lg n − 1} 2^i·(n/2^i)² + 2^(lg n)·W(1)
     = n² Σ_{i=0}^{lg n − 1} (1/2)^i + Θ(n)
     ≤ n² Σ_{i=0}^{∞} (1/2)^i + O(n)
     = n²·(1/(1 − 1/2)) + O(n) = 2n² + O(n)
W(n) = O(n²)
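The level-by-level sum above can be checked against direct evaluation of the recurrence (a sketch assuming base case W(1) = 1):

```python
def W(n):
    """Evaluate W(n) = 2W(n/2) + n^2 with assumed base case W(1) = 1."""
    return 1 if n == 1 else 2 * W(n // 2) + n * n

# Summing the levels gives W(n) = 2n^2 - n exactly for n a power of 2,
# which stays below the 2n^2 bound from the geometric series.
for k in range(12):
    n = 2 ** k
    assert W(n) == 2 * n * n - n
    assert W(n) <= 2 * n * n
```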
59. Master’s method
“Cookbook” for solving recurrences of the form:
  T(n) = aT(n/b) + f(n)
where a ≥ 1, b > 1, and f(n) > 0.
Idea: compare f(n) with n^(log_b a):
• f(n) is asymptotically smaller or larger than n^(log_b a) by a polynomial factor n^ε, or
• f(n) is asymptotically equal to n^(log_b a).
60. Master’s method
“Cookbook” for solving recurrences of the form:
  T(n) = aT(n/b) + f(n)
where a ≥ 1, b > 1, and f(n) > 0.
Case 1: if f(n) = O(n^(log_b a − ε)) for some ε > 0, then: T(n) = Θ(n^(log_b a))
Case 2: if f(n) = Θ(n^(log_b a)), then: T(n) = Θ(n^(log_b a) lg n)
Case 3: if f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and if
af(n/b) ≤ cf(n) for some c < 1 and all sufficiently large n (the regularity condition), then:
T(n) = Θ(f(n))
61. Why n^(log_b a)?
T(n) = aT(n/b) + f(n). Ignoring f(n), iterate T(n) = aT(n/b):
  T(n) = aT(n/b) = a²T(n/b²) = a³T(n/b³) = … = a^i T(n/b^i) for all i
• Assume n = b^k, so k = log_b n.
• At the end of iteration i = k:
  T(n) = a^(log_b n) T(n/b^(log_b n)) = a^(log_b n) T(1) = n^(log_b a) T(1) = Θ(n^(log_b a))
• Case 1: if f(n) is dominated by n^(log_b a): T(n) = Θ(n^(log_b a))
• Case 2: if f(n) = Θ(n^(log_b a)): T(n) = Θ(n^(log_b a) lg n)
• Case 3: if f(n) dominates n^(log_b a): T(n) = Θ(f(n))
62. The Master Method
Provides a “cookbook” method for solving recurrences of the form T(n) = aT(n/b) + f(n).
This recurrence describes the running time of an algorithm that
divides a problem of size n into a subproblems, each of size n/b,
where a and b are positive constants.
• a is a constant that shows into how many subproblems the original is divided.
• b is a constant that determines the size n/b of each subproblem.
• The a subproblems are solved recursively, each in time T(n/b).
• f(n) is an asymptotically positive function that encompasses the cost of
dividing the problem and combining the results of the subproblems.
• Note that n/b may not be an integer, but the analysis does not
change for either ⌊n/b⌋ or ⌈n/b⌉.
63. The Master Theorem
• Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n)
be defined on the nonnegative integers by the recurrence:
  T(n) = aT(n/b) + f(n)
• Then T(n) has the following asymptotic bounds.
• Three cases:
1. f(n) = O(n^(log_b a − ε)) for some constant ε > 0.
• f(n) grows polynomially slower than n^(log_b a) (by an n^ε factor).
• Solution: T(n) = Θ(n^(log_b a)).
2. f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
• The two functions f(n) and n^(log_b a) are the same size (grow at similar rates), so we
multiply by a logarithmic factor.
• Solution: T(n) = Θ(n^(log_b a) lg n) = Θ(f(n) lg n).
3. f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and af(n/b) ≤ cf(n) for some constant
c < 1 and all sufficiently large n.
• f(n) grows polynomially faster than n^(log_b a) (by an n^ε factor).
• Solution: T(n) = Θ(f(n)).
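For driving functions of the simple polynomial form f(n) = n^k, the three cases can be encoded as a small helper; `master` and its string output are a hypothetical illustration, not a standard API:

```python
import math

def master(a, b, k):
    """Classify T(n) = a*T(n/b) + Theta(n^k) via the master theorem.
    A hypothetical helper for polynomial f(n) = n^k only; the regularity
    condition of case 3 always holds for such f(n)."""
    e = math.log(a) / math.log(b)        # critical exponent log_b(a)
    if math.isclose(k, e):
        return f"Theta(n^{e:g} lg n)"    # case 2: f(n) = Theta(n^log_b(a))
    if k < e:
        return f"Theta(n^{e:g})"         # case 1: f(n) polynomially smaller
    return f"Theta(n^{k:g})"             # case 3: f(n) polynomially larger

print(master(2, 2, 1))    # merge sort -> Theta(n^1 lg n)
print(master(9, 3, 1))    # -> Theta(n^2)
print(master(1, 1.5, 0))  # -> Theta(n^0 lg n), i.e., Theta(lg n)
print(master(4, 2, 3))    # -> Theta(n^3)
```

Note that `b` need not be an integer, which covers recurrences such as T(n) = T(2n/3) + 1.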
64. Example 1:
Use the master method to solve the recurrence: T(n) = 2T(n/2) + n
Solution: a = 2, b = 2, log₂2 = 1
Compare n^(log₂2) = n with f(n) = n.
Case 2: f(n) = Θ(n)
T(n) = Θ(n lg n)
65. Example 2:
Use the master method to solve the recurrence: T(n) = 2T(n/2) + n²
Solution: a = 2, b = 2, log₂2 = 1
Compare n^(log₂2) = n with f(n) = n².
Case 3: f(n) = Ω(n^(log_b a + ε)) = Ω(n^(1+ε)): n² = n^(1+ε) gives 2 = 1 + ε, so ε = 1 > 0.
Verify the regularity condition a·f(n/b) ≤ c·f(n):
2·(n/2)² = n²/2 ≤ cn², so c = 1/2 is a solution (c < 1).
T(n) = Θ(n²)
66. Example 3:
Use the master method to solve the recurrence: T(n) = 2T(n/2) + √n
Solution: a = 2, b = 2, log₂2 = 1
Compare n^(log₂2) = n with f(n) = n^(1/2).
Case 1: f(n) = O(n^(log_b a − ε)): 1 − ε = 1/2 gives ε = 1 − 0.5 = 0.5 > 0,
so f(n) = O(n^(1−ε)) with ε = 0.5.
T(n) = Θ(n)
67. Example 4:
Use the master method to solve the recurrence: T(n) = 9T(n/3) + n
Solution: We have a = 9, b = 3, and n^(log₃9) = n².
Case 1: f(n) = n = O(n^(2−ε)) for ε = 1 > 0.
T(n) = Θ(n²)
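The Θ(n²) bound can be cross-checked by evaluating the recurrence directly (a sketch assuming base case T(1) = 1):

```python
def T(n):
    """Evaluate T(n) = 9T(n/3) + n with assumed base case T(1) = 1."""
    return 1 if n == 1 else 9 * T(n // 3) + n

# Unrolling gives the closed form T(n) = (3n^2 - n)/2 for n a power of 3,
# confirming the Theta(n^2) bound from case 1.
for k in range(9):
    n = 3 ** k
    assert 2 * T(n) == 3 * n * n - n
```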
68. Example 5:
Use the master method to solve the recurrence: T(n) = T(2n/3) + 1
Solution: Rewrite T(n) = T(2n/3) + 1 = T(n/(3/2)) + 1, so a = 1, b = 3/2,
and n^(log_(3/2) 1) = n⁰ = 1.
Case 2: f(n) = 1 = Θ(n^(log_b a)).
T(n) = Θ(lg n)
69. Example 6:
Use the master method to solve the recurrence: T(n) = 3T(n/4) + n lg n
Solution: We have a = 3, b = 4, and n^(log₄3) = O(n^0.793).
Case 3: f(n) = n lg n = Ω(n^(log₄3 + ε)): 0.793 + ε = 1 gives ε = 1 − 0.793 ≈ 0.207 > 0.
Verify the regularity condition a·f(n/b) ≤ c·f(n):
a·f(n/b) = 3f(n/4) = 3(n/4) lg(n/4) = (3/4)n(lg n − lg 4) = (3/4)n lg n − (3/2)n
         ≤ (3/4)n lg n = c·f(n)
for c = 3/4 < 1.
T(n) = Θ(n lg n)
71. Example 7:
Use the master method to solve the recurrence: T(n) = 2T(n/2) + n lg n
Solution: We have a = 2, b = 2, log₂2 = 1.
Compare n^(log₂2) = n with f(n) = n lg n.
It seems like case 3 should apply, BUT:
• f(n) must be polynomially larger, by a factor of n^ε.
• In this case it is only larger by a factor of lg n.
• The master method does not apply.
• In particular, for every constant ε > 0, we have n^ε = ω(lg n).
72. Examples 8–9:
Use the master method to solve the recurrence: T(n) = 4T(n/2) + n
Solution: We have a = 4, b = 2, n^(log₂4) = n², and f(n) = n.
Case 1: f(n) = O(n^(2−ε)) for ε = 1 > 0.
T(n) = Θ(n²)
Use the master method to solve the recurrence: T(n) = 4T(n/2) + n²
Solution: We have a = 4, b = 2, n^(log₂4) = n², and f(n) = n².
Case 2: f(n) = Θ(n²)
T(n) = Θ(n² lg n)
73. Examples 10–11:
Use the master method to solve the recurrence: T(n) = 4T(n/2) + n³
Solution: We have a = 4, b = 2, n^(log₂4) = n², and f(n) = n³.
Case 3: f(n) = Ω(n^(2+ε)) for ε = 1.
Regularity condition: a·f(n/b) = 4(n/2)³ = 4n³/8 = n³/2 ≤ cn³ for c = 1/2 < 1.
T(n) = Θ(n³)
Use the master method to solve the recurrence: T(n) = 4T(n/2) + n²/lg n
Solution: We have a = 4, b = 2, n^(log₂4) = n², and f(n) = n²/lg n.
• It seems like case 1 should apply, but f(n) must be polynomially smaller by a factor of n^ε.
• In this case it is only smaller by a factor of lg n.
• The master method does not apply.
74. References
• T. H. Cormen, C. E. Leiserson, R. L. Rivest, C. Stein, “Introduction to
Algorithms”, 3rd Edition, The MIT Press, 2009, ISBN-10: 0262033844, ISBN-13:
978-0262033848, Chapter 4.
• https://www.cse.unr.edu/~bebis/CS477/
• https://www.youtube.com/watch?v=8gt0D0IqU5w
• https://www.youtube.com/watch?v=whjt_N9uYFI
• https://www.youtube.com/watch?v=PSuNNYw1BNA&list=PL6EF0274BD849A7D5&index=5&t=13s
Editor's Notes
For example, let us see how a recursion tree would provide a good guess for the recurrence T(n) = 3T(n/4) + Θ(n²).
We start by focusing on finding an upper bound for the solution. Because we know that floors and ceilings usually do not matter when solving recurrences (here’s an example of sloppiness that we can tolerate), we create a recursion tree for the recurrence T(n) = 3T(n/4) + cn², having written out the implied constant coefficient c > 0.
Figure 4.5 shows how we derive the recursion tree for T(n) = 3T(n/4) + cn². For convenience, we assume that n is an exact power of 4 (another example of tolerable sloppiness) so that all subproblem sizes are integers.
Part (a) of the figure shows T(n), which we expand in part (b) into an equivalent tree representing the recurrence.
The cn² term at the root represents the cost at the top level of recursion, and the three subtrees of the root represent the costs incurred by the subproblems of size n/4.
Part (c) shows this process carried one step further by expanding each node with cost T(n/4) from part (b). The cost for each of the three children of the root is c(n/4)². We continue expanding each node in the tree by breaking it into its constituent parts as determined by the recurrence.