Chapter 4  Cookbook Methods for Solving Recurrences 
Resources
 Wikipedia  http://en.wikipedia.org/wiki/Master_theorem
 Duke University, "BigOh for Recursive Functions: Recurrence Relations", http://www.cs.duke.edu/~ola/ap/recurrence.html
 Massachusetts Institute of Technology (MIT), "Master Theorem: Practice Problems and Solutions", http://www.csail.mit.edu/~thies/6.046web/master.pdf
 Dartmouth College, http://www.math.dartmouth.edu/archive/m19w03/public_html/Section52.pdf
Overview
If we do not know the exact closed-form equation for T(n):
 use backward or forward substitution to find the exact equation, or use experience to make a good guess, then prove the guess holds by substituting it into an induction proof;
 if no guess is available, use a recursion tree to approximate bounds on T(n), use the bound as the guess, and prove it by induction;
 if the recurrence is divide-and-conquer, the master method can often be applied directly.
Master Method for Divide and Conquer
T(n) = a * T(n/b) + f(n)
a the number of subproblems
b determines the size of subproblems
f(n) the nonrecursive cost of each recurrence (dividing and combining)
Example
MergeSort: T(n) = 2T(n/2) + O(n)
a = 2
b = 2
f(n) = O(n)
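As a sanity check (not part of the original notes), the MergeSort recurrence can be evaluated directly in Python, assuming a base cost T(1) = 1 and modeling the O(n) divide/combine cost as exactly n:

```python
def T(n):
    """MergeSort-style recurrence T(n) = 2*T(n/2) + n, with T(1) = 1 (assumed)."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# For n = 2^k the recurrence unrolls to exactly n*lg(n) + n,
# which is Theta(n lg n).
for k in range(1, 12):
    n = 2 ** k
    assert T(n) == n * k + n
```

The exact closed form n lg n + n for powers of 2 confirms the familiar Θ(n lg n) bound.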
Example
The run time cost of the following procedure f(n) is defined by the recurrence T(n):
f(n)
    if n == 1 return    // cost 6
    else
        f(n/2)
        f(n/2)
        f(n/2)
        f(n/2)
        return          // cost 5
T(1) = 6 for n = 1
T(n) = 4T(n/2) + 5 for all n > 1
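This recurrence can be checked numerically (a sketch, not from the original notes) by evaluating it directly:

```python
def T(n):
    # Recurrence from the cost annotations: T(1) = 6, T(n) = 4*T(n/2) + 5.
    if n == 1:
        return 6
    return 4 * T(n // 2) + 5

# For n = 2^k, unrolling gives T(n) = 6*4^k + 5*(4^k - 1)/3 = (23*n^2 - 5)/3,
# i.e. Theta(n^2), consistent with a = 4, b = 2, n^{log_2 4} = n^2.
for k in range(0, 10):
    n = 2 ** k
    assert 3 * T(n) == 23 * n * n - 5
```

The quadratic growth comes from the 4^k leaves, not from the constant per-call cost.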
Master Theorem  Divide and Conquer
T(n) = a * T(n/b) + f(n), where n/b stands for either ⌊n/b⌋ or ⌈n/b⌉
a ≥ 1 the number of subproblems
b > 1 the factor by which the problem size shrinks
f(n) = D(n) + C(n) is a function describing the cost of dividing and combining
T(n) can be asymptotically bounded for 3 cases as follows:
Case 1 (f(n) < n^{log_b a}; intuitively, cost is dominated by the leaves, T(n/b)):
If f(n) = O(n^{log_b a − ε}) for some constant ε > 0,
then T(n) ∈ Θ(n^{log_b a}).
Case 2 (f(n) = n^{log_b a}):
If f(n) = Θ(n^{log_b a}),
then T(n) ∈ Θ(n^{log_b a} lg n).
Case 3 (f(n) > n^{log_b a}; intuitively, cost is dominated by the root, f(n)):
If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0, and
a f(n/b) ≤ c f(n) for some constant c < 1 and all sufficiently large n,
then T(n) ∈ Θ(f(n)),
i.e. f(n) is polynomially greater than n^{log_b a}.
Question
Note that, when applicable, the master method determines a recurrence to be Θ, i.e. a tight bound.
Are all recurrences Θ?
Example  Case 1
T(n) = a * T(n/b) + f(n)
b  rate at which the problem size is divided
a  rate at which the number of subproblems grows
f(n)  cost of each recurrence, i.e. the cost to solve a problem of size n
n^{log_b a} = a^{log_b n}  the number of size-1 problems
Case 1 (f(n) < n^{log_b a}; intuitively, cost is dominated by the leaves, T(n/b)): If f(n) = O(n^{log_b a − ε}) for some constant ε > 0, then T(n) ∈ Θ(n^{log_b a}).
T(n) = 9T(n/3) + n, where f(n) = n < n^{log_3 9} = n^2
a = 9, b = 3, f(n) = n
n^{log_b a} = n^{log_3 9} = n^{log_3 3^2} = n^2
f(n) = n = O(n^{log_3 9 − ε}) = O(n^{2 − ε}) = O(n^{2 − 1}) = O(n) where ε = 1, so we can apply Case 1 and conclude:
T(n) = Θ(n^{log_3 9}) = Θ(n^2)
Recursion tree cost: (n^2 − n)/2 + T(1)·9^{log_3 n} = (n^2 − n)/2 + T(1)·n^{log_3 9} = (n^2 − n)/2 + T(1)·n^2
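The recursion-tree total above can be verified with a short Python sketch (not in the original notes; T(1) = 1 is an assumed base cost):

```python
def T(n):
    # T(1) = 1 (assumed), T(n) = 9*T(n/3) + n.
    if n == 1:
        return 1
    return 9 * T(n // 3) + n

# Recursion-tree total: (n^2 - n)/2 internal cost plus T(1)*n^2 for the leaves.
for k in range(0, 8):
    n = 3 ** k
    assert T(n) == (n * n - n) // 2 + n * n
```

With T(1) = 1 the formula (n^2 − n)/2 + n^2 is exact for every power of 3.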
Example  Case 1
T(n) = a * T(n/b) + f(n)
Case 1 (f(n) < n^{log_b a}; intuitively, cost is dominated by the leaves, T(n/b)): If f(n) = O(n^{log_b a − ε}) for some constant ε > 0, then T(n) ∈ Θ(n^{log_b a}).
T(n) = 8T(n/2) + Θ(n^2)
a = 8, b = 2, f(n) = Θ(n^2) ⇒ f(n) = O(n^2)
n^{log_2 8} = n^3, and n^2 < n^3
f(n) = O(n^{log_2 8 − ε}) = O(n^{3 − 1}) = O(n^2) where ε = 1, so we can apply Case 1 and conclude:
T(n) = Θ(n^{log_2 8}) = Θ(n^{lg 8}) = Θ(n^3)
Recursion tree cost: n^3 − n^2 + T(1)·8^{log_2 n} = n^3 − n^2 + T(1)·n^{log_2 8} = n^3 − n^2 + T(1)·n^3
Question 4.14  Use Case 1 to determine:
T(n) = 16T(n/4) + n ⇒ T(n) = Θ(n^{2})
Example  Case 2
T(n) = a * T(n/b) + f(n)
T(n) = 27T(n/3) + n^3
If f(n) = Θ(n^{log_b a}) then T(n) ∈ Θ(n^{log_b a} lg n)
a = 27, b = 3, f(n) = n^3
n^{log_3 27} = n^3 = f(n), so Case 2 applies:
T(n) = Θ(n^3 lg n)
Confirmation by recursion tree: each of the Θ(log_3 n) levels costs 27^i (n/3^i)^3 = n^3, so the total is Θ(n^3 lg n).
Example  Case 2
T(n) = a * T(n/b) + f(n)
b  rate at which the problem size is divided
a  rate at which the number of subproblems grows
f(n)  cost of each recurrence, i.e. the cost to solve a problem of size n
n^{log_b a} = a^{log_b n}  the number of size-1 problems
Case 2 (f(n) = n^{log_b a}): If f(n) = Θ(n^{log_b a}), then T(n) ∈ Θ(n^{log_b a} lg n).
T(n) = T(2n/3) + 1 = T(n/(3/2)) + 1
a = 1, b = 3/2, f(n) = 1
n^{log_{3/2} 1} = n^0 = 1 (recall: log_b 1 = 0)
f(n) = 1 = Θ(n^{log_{3/2} 1}) = Θ(n^0) = Θ(1), so Case 2 applies:
T(n) = Θ(n^{log_{3/2} 1} lg n) = Θ(lg n)
Question 4.15  Use Case 2 to determine:
T(n) = 4T(n/2) + n^{2} ⇒ T(n) = Θ(n^{2} lg n)
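The Θ(n^2 lg n) answer can be sanity-checked numerically (a sketch, not from the original notes; T(1) = 1 is an assumed base cost):

```python
def T(n):
    # T(1) = 1 (assumed), T(n) = 4*T(n/2) + n^2.
    if n == 1:
        return 1
    return 4 * T(n // 2) + n * n

# Each of the lg n levels costs exactly n^2, plus n^2 for the leaves:
# T(2^k) = n^2 * (k + 1), i.e. Theta(n^2 lg n).
for k in range(0, 12):
    n = 2 ** k
    assert T(n) == n * n * (k + 1)
```

The lg n factor appears because every level of the tree contributes the same n^2 cost.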
Example  Case 2 (More general than text Theorem 4.1)
T(n) = a * T(n/b) + f(n)
Case 2 (general form): If f(n) = Θ(n^{log_b a} (lg n)^k) for some constant k ≥ 0, then T(n) ∈ Θ(n^{log_b a} (lg n)^{k+1}).
T(n) = 27T(n/3) + Θ(n^3 lg n)
a = 27, b = 3, f(n) = Θ(n^3 lg n)
n^{log_3 27} = n^3
f(n) = Θ(n^{log_3 27} (lg n)^1) = Θ(n^3 (lg n)^1), i.e. f(n) is within a polylog factor of n^{log_b a}, but not smaller: within a factor of (lg n)^1 of n^3.
Solution: T(n) = Θ(n^{log_b a} (lg n)^{k+1}) = Θ(n^{log_3 27} (lg n)^{1+1}) = Θ(n^3 (lg n)^2)
This formulation of Case 2 is more general than in Theorem 4.1; it is given in Exercise 4.4-2.
Intuitively: the cost is n^{log_b a} (lg n)^k at each level, and there are Θ(lg n) levels. Simple case: k = 0 → f(n) = Θ(n^{log_b a}) → T(n) = Θ(n^{log_b a} lg n).
Polylog: a polylogarithmic function of n is a polynomial in the logarithm of n, e.g. (log n)^k.
http://en.wikipedia.org/wiki/Polylogarithmic
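The extra lg n factor in the general Case 2 can be observed directly (a sketch, not from the original notes). To keep the arithmetic exact, log base 3 is used in place of lg; the two differ only by a constant factor, which does not affect the Θ bound:

```python
def T(n, k):
    # T(1) = 1 (assumed); T(n) = 27*T(n/3) + n^3 * log3(n), where n = 3^k.
    if k == 0:
        return 1
    return 27 * T(n // 3, k - 1) + n ** 3 * k

# Level i contributes 27^i * (n/3^i)^3 * (k - i) = n^3 * (k - i), so
# T(n) = n^3 * (1 + k*(k+1)/2), i.e. Theta(n^3 (log n)^2).
for k in range(0, 7):
    n = 3 ** k
    assert T(n, k) == n ** 3 * (1 + k * (k + 1) // 2)
```

Summing the per-level costs n^3·k, n^3·(k−1), ..., n^3·1 is what produces the (log n)^2 term.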
Example  Case 3
T(n) = a * T(n/b) + f(n)
b  rate at which the problem size is divided
a  rate at which the number of subproblems grows
f(n)  cost of each recurrence, i.e. the cost to solve a problem of size n
n^{log_b a} = a^{log_b n}  the number of size-1 problems
Case 3 (f(n) > n^{log_b a}; intuitively, cost is dominated by the root, f(n)): If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0 and
a f(n/b) ≤ c f(n) for some constant c < 1 and all sufficiently large n,
then T(n) ∈ Θ(f(n)).
T(n) = 4T(n/2) + n^3
a = 4, b = 2, f(n) = n^3
n^{log_2 4} = n^2
If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0:
f(n) = n^3 = Ω(n^{log_2 4 + ε}) = Ω(n^{2 + ε}) = Ω(n^3) for ε = 1
and if a f(n/b) ≤ c f(n) for some constant c < 1 and all sufficiently large n:
a f(n/b) = 4(n/2)^3 = n^3/2 ≤ c n^3 = c f(n) for c = 1/2 < 1
then T(n) ∈ Θ(f(n)) = Θ(n^3)
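A quick Python check (not in the original notes; T(1) = 1 is an assumed base cost) confirms that the root term dominates:

```python
def T(n):
    # T(1) = 1 (assumed), T(n) = 4*T(n/2) + n^3.
    if n == 1:
        return 1
    return 4 * T(n // 2) + n ** 3

# The level costs n^3, n^3/2, n^3/4, ... form a decreasing geometric series,
# so for n = 2^k the total is exactly 2*n^3 - n^2, i.e. Theta(n^3) = Theta(f(n)).
for k in range(0, 10):
    n = 2 ** k
    assert T(n) == 2 * n ** 3 - n * n
```

The total is at most twice the root's cost, which is exactly what the regularity condition guarantees.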
Example  Case 3
T(n) = a * T(n/b) + f(n)
b  rate at which the problem size is divided
a  rate at which the number of subproblems grows
f(n)  cost of each recurrence, i.e. the cost to solve a problem of size n
n^{log_b a} = a^{log_b n}  the number of size-1 problems
Case 3 (f(n) > n^{log_b a}; intuitively, cost is dominated by the root, f(n)): If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0 and
a f(n/b) ≤ c f(n) for some constant c < 1 and all sufficiently large n,
then T(n) ∈ Θ(f(n)).
T(n) = 3T(n/4) + n lg n, where n lg n > n^{log_4 3} ≈ n^{0.793}
a = 3, b = 4, f(n) = n lg n
n^{log_4 3} ≈ n^{0.793}
If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0:
f(n) = n lg n = Ω(n^{log_4 3 + ε}) = Ω(n^{0.793 + ε}) = Ω(n^1) for ε ≈ 0.2
and if a f(n/b) ≤ c f(n) for some constant c < 1 and all sufficiently large n:
a f(n/b) = 3(n/4) lg(n/4) = (3/4) n [lg n − lg 4] = (3/4) n [lg n − lg 2^2] = (3/4) n [lg n − 2] = (3/4) n lg n − (3/2) n ≤ (3/4) n lg n = c f(n) for c = 3/4 < 1
then T(n) ∈ Θ(f(n)) = Θ(n lg n)
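The effect of the regularity condition can be seen numerically (a sketch, not from the original notes; T(1) = 1 is an assumed base cost):

```python
from math import log2

def T(n):
    # T(1) = 1 (assumed), T(n) = 3*T(n/4) + n*lg(n).
    if n == 1:
        return 1.0
    return 3 * T(n // 4) + n * log2(n)

# Because a*f(n/b) <= (3/4)*f(n), the level costs form a decreasing geometric
# series and T(n) stays within a constant factor of f(n) = n lg n
# (the ratio is bounded by 1/(1 - 3/4) = 4).
for k in range(1, 12):
    n = 4 ** k
    assert 1.0 <= T(n) / (n * log2(n)) < 4.0
```

The ratio T(n)/(n lg n) stays bounded, consistent with T(n) = Θ(n lg n).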
Question 4.16  Use Case 3 to determine:
T(n) = 3T(n/2) + n^{2} ⇒ T(n) = Θ(n^{2})
Hint: It is not necessary to find an exact ε, just that f(n) = Ω(n^{log_b a + ε}) = Ω(n^2) for some constant ε > 0; see the graph.
If c g(n) ≤ f(n) for some c > 0 and all n ≥ n_0, then f(n) ∈ Ω(g(n)).
T(n) = a * T(n/b) + f(n); f(n) > n^{log_b a}, so intuitively cost is dominated by the root, f(n).
If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0 and
a f(n/b) ≤ c f(n) for some constant c < 1 and all sufficiently large n,
then T(n) ∈ Θ(f(n)).
Example  Master method does not apply
T(n) = 2T( n/2 ) + n lg n
a = 2, b = 2, f(n) = n lg n
n^{log_2 2} = n
Case 1: Does not apply because
f(n) = n lg n is not O(n^{log_2 2 − ε}) = O(n^{1 − ε}) for any ε > 0,
so we cannot subtract an ε.
Recall Case 1: If f(n) = O(n^{log_b a − ε}) for some constant ε > 0 (f(n) < n^{log_b a}), then T(n) ∈ Θ(n^{log_b a}).
Recall: f(n) ∈ O(g(n)) iff lim_{n→∞} f(n)/g(n) = c for some 0 ≤ c < ∞,
but lim_{n→∞} (n lg n)/n = lim lg n = ∞.
T(n) = 2T( n/2 ) + n lg n
a = 2, b = 2, f(n) = n lg n
n^{log_2 2} = n
Case 2: Does not apply because
f(n) = n lg n ≠ Θ(n^{log_2 2}) = Θ(n),
since n lg n is not O(n).
Recall Case 2: If f(n) = Θ(n^{log_b a}), then T(n) ∈ Θ(n^{log_b a} lg n).
Recall: f(n) ∈ Θ(g(n)) iff lim_{n→∞} f(n)/g(n) = c for some 0 < c < ∞,
but lim_{n→∞} (n lg n)/n = ∞.
T(n) = 2T( n/2 ) + n lg n
a = 2, b = 2, f(n) = n lg n
n^{log_2 2} = n
Case 3: Requires f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0 and
a f(n/b) ≤ c f(n) for some constant c < 1 and all sufficiently large n,
in which case T(n) ∈ Θ(f(n)).
Fails because, while
f(n) = n lg n = Ω(n^{log_2 2}) = Ω(n^1) = Ω(n),
f(n) = n lg n ≠ Ω(n^{1 + ε}) for any choice of ε > 0, so we cannot add an ε.
To see this, recall: f(n) ∈ ω(g(n)) iff lim_{n→∞} f(n)/g(n) = ∞.
f(n)/n^{log_2 2} = (n lg n)/n^1 = lg n → ∞,
so f(n) grows strictly faster than n^1, but Case 3 requires f(n) to be polynomially larger than n^{log_b a}.
n^2 is polynomially larger than n^1; n lg n is not, since
(n lg n)/n^{1+ε} = (n lg n)/(n^1 · n^ε) = (lg n)/n^ε → 0 for any constant ε > 0.
The graph shows that lg n grows more slowly than n^{0.25}.
Therefore the master method does not apply to this recurrence.
Method for Chip & Conquer
The problem of size n is chipped down into one subproblem of size n − c.
T(n) = T(n − c) + f(n)
If
 c > 0 (the chipping factor)
 f(n)  nonrecursive cost (to create the subproblem and/or combine with the solutions of other subproblems)
then T(n) can be asymptotically bounded as follows:
If f(n) is a polynomial n^α, then T(n) ∈ Θ(n^{α+1})
If f(n) is lg n, then T(n) ∈ Θ(n lg n)
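The polynomial rule can be illustrated with f(n) = n, a degree-1 polynomial (a sketch, not from the original notes; T(0) = 0 is an assumed base case):

```python
def T(n):
    # Chip-and-conquer: T(0) = 0 (assumed), T(n) = T(n - 1) + n, so c = 1, f(n) = n.
    if n == 0:
        return 0
    return T(n - 1) + n

# f(n) = n^1 gives T(n) in Theta(n^(1+1)) = Theta(n^2); here exactly n*(n+1)/2.
for n in range(0, 500, 57):
    assert T(n) == n * (n + 1) // 2
```

Summing f over the n chipping steps raises the polynomial's degree by one, as the rule states.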
Example  Factorial
int factorial( int n ) {
    if ( n == 1 ) return 1;
    return n * factorial( n - 1 );
}
Recurrence equation:
T(n) = e if n = 1
T(n) = T(n − 1) + d if n > 1
f(n) = d is a 0-degree polynomial, n^0, so
T(n) ∈ Θ(n^{0+1}) = Θ(n)
Question 4.17  Use ChipandConquer to determine:
T(n) = T(n4) + n^{2} ⇒ T(n) = Θ(n^{3})
Method for Chip & Be Conquered
The problem of size n is chipped down into b subproblems of size n − c.
T(n) = b * T(n − c) + f(n)
If
 b > 1 (the branching factor)
 c > 0 (the chipping factor)
 f(n)  nonrecursive cost (to create subproblems and/or combine with the solutions of other subproblems)
then in most cases T(n) can be bounded as T(n) ∈ Θ(b^{n/c}).
T(n) grows exponentially in n, so in general such recurrences cannot be solved for problems of significant size.
Example
T(n) = e if n = 1
T(n) = 5T(n − 1) + d if n > 1
b = 5
c = 1
f(n) = d
T(n) ∈ Θ(b^{n/c}) = Θ(5^{n/1}) = Θ(5^n)
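The exponential growth can be confirmed by unrolling the recurrence in Python (a sketch, not from the original notes; e = d = 1 are assumed constants chosen for the check):

```python
def T(n, e=1, d=1):
    # T(1) = e, T(n) = 5*T(n-1) + d; e = d = 1 here (assumed values).
    if n == 1:
        return e
    return 5 * T(n - 1, e, d) + d

# Unrolling gives T(n) = 5^(n-1)*e + d*(5^(n-1) - 1)/4, i.e. Theta(5^n).
for n in range(1, 15):
    assert 4 * T(n) == 4 * 5 ** (n - 1) + 5 ** (n - 1) - 1
```

Each chip multiplies the work by the branching factor 5, so even modest n values are infeasible.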
Question 4.18  Use ChipandbeConquered to determine bounds for:
T(n) = 7T(n4) + n^{2}