# Chapter 4 - Cookbook Methods for Solving Recurrences

Resources

1. Wikipedia - http://en.wikipedia.org/wiki/Master_theorem
2. Duke University, "Big-Oh for Recursive Functions: Recurrence Relations", http://www.cs.duke.edu/~ola/ap/recurrence.html
3. Massachusetts Institute of Technology (MIT), "Master Theorem: Practice Problems and Solutions", http://www.csail.mit.edu/~thies/6.046-web/master.pdf
4. Dartmouth College, http://www.math.dartmouth.edu/archive/m19w03/public_html/Section5-2.pdf

Overview

If we do not know an exact closed-form equation for T(n):

1. find the exact equation by backward or forward substitution, or make a good guess from experience, then prove that the guess holds by induction (the substitution method),

2. if there is no obvious guess, use a recursion tree to approximate bounds on T(n), use those as the guess, and prove the guess by induction,

3. if the recurrence has divide-and-conquer form, the master method can often be applied directly.

Master Method for Divide and Conquer

T(n) = a * T(n/b) + f(n)

• a  the number of subproblems

• b  determines the size of each subproblem (n/b)

• f(n)  the nonrecursive cost of each call (dividing and combining)

Example

MergeSort: T(n) = 2T(n/2) + O(n)

a = 2

b = 2

f(n) = O(n)

Example

The running-time cost of the function f below is described by the recurrence T(n):

    f(n)
        if n == 1 return     // Cost 6
        else
            f(n/2)
            f(n/2)
            f(n/2)
            f(n/2)
            return           // Cost 5

T(1) = 6              for n = 1

T(n) = 4T(n/2) + 5    for all n > 1
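As a sanity check, the recurrence can be evaluated directly and compared against its closed form. A minimal Python sketch (the closed form (23n^2 - 5)/3 is obtained by unrolling the recurrence for n a power of 2; it is not from the text):

```python
def T(n):
    """T(1) = 6, T(n) = 4*T(n/2) + 5, for n a power of 2."""
    if n == 1:
        return 6
    return 4 * T(n // 2) + 5

# Unrolling gives T(n) = (23*n^2 - 5)/3, i.e. T(n) is Theta(n^2),
# as the master method (a = 4, b = 2, f(n) = 5) predicts.
for k in range(11):
    n = 2 ** k
    assert T(n) == (23 * n * n - 5) // 3
```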

Master Theorem - Divide and Conquer

T(n) = a * T(n/b) + f(n),                where n/b stands for either ⌊n/b⌋ or ⌈n/b⌉

• a ≥ 1  the number of subproblems

• b > 1  determines the subproblem size

• f(n) = D(n) + C(n)  a function describing the cost of dividing and combining

T(n) can be asymptotically bounded in 3 cases as follows:

1. f(n) < n^(log_b a). Intuitively: the cost is dominated by the leaves, T(n/b).
   If f(n) = O(n^(log_b a - ε)) for some constant ε > 0, then T(n) ∈ Θ(n^(log_b a)).

2. f(n) = n^(log_b a).
   If f(n) = Θ(n^(log_b a)), then T(n) ∈ Θ(n^(log_b a) * lg n).

3. f(n) > n^(log_b a). Intuitively: the cost is dominated by the root, f(n).
   If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and a f(n/b) ≤ c f(n) for some constant c < 1 and all sufficiently large n, then T(n) ∈ Θ(f(n)), i.e. f(n) is polynomially greater than n^(log_b a).
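When the driving function is a plain polynomial f(n) = n^k, choosing the case reduces to comparing b^k with a (equivalently, k with log_b a), and for such f the Case 3 regularity condition holds automatically. A small illustrative Python sketch (the function name and interface are ours, not from the text):

```python
def master_case(a, b, k):
    """Classify T(n) = a*T(n/b) + n**k into a master-theorem case.
    Compares f(n) = n^k against n^(log_b a) via b^k vs. a, avoiding
    floating-point logs. Assumes f(n) is a plain polynomial, so the
    Case 3 regularity condition a*f(n/b) <= c*f(n) holds automatically."""
    bk = b ** k                # k < log_b a  <=>  b^k < a  (since b > 1)
    if bk < a:
        return 1               # T(n) = Theta(n^(log_b a))
    if bk == a:
        return 2               # T(n) = Theta(n^(log_b a) * lg n)
    return 3                   # T(n) = Theta(n^k)

print(master_case(9, 3, 1))    # T(n) = 9T(n/3) + n     -> Case 1
print(master_case(27, 3, 3))   # T(n) = 27T(n/3) + n^3  -> Case 2
print(master_case(4, 2, 3))    # T(n) = 4T(n/2) + n^3   -> Case 3
print(master_case(1, 1.5, 0))  # T(n) = T(2n/3) + 1     -> Case 2
```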

Question

Note that when it applies, the master method determines a tight Θ bound for the recurrence.

Are all recurrences Θ?

Example - Case 1

T(n) = a * T(n/b) + f(n)

b - the rate at which the problem size is divided

a - the rate at which the number of subproblems grows

f(n) - the cost of each call, i.e. the nonrecursive cost to solve a problem of size n

n^(log_b a) = a^(log_b n) - the number of size-1 problems

f(n) < n^(log_b a). Intuitively: the cost is dominated by the leaves, T(n/b). If f(n) = O(n^(log_b a - ε)) for some constant ε > 0, then T(n) ∈ Θ(n^(log_b a)).

T(n) = 9T(n/3) + n                f(n) < n^(log_b a)

a = 9        b = 3         f(n) = n

n^(log_b a) = n^(log_3 9) = n^(log_3 3^2) = n^2                n < n^(log_3 9) = n^2

If f(n) = O(n^(log_3 9 - ε)) for some constant ε > 0, then T(n) ∈ Θ(n^(log_b a))

f(n) = n = O(n^(log_3 9 - ε)) = O(n^(2 - ε)) = O(n^(2 - 1)) = O(n) with ε = 1, so we can apply Case 1 and conclude:

T(n) = Θ(n^(log_3 9)) = Θ(n^2)

Recursion-tree cost:   (n^2 - n)/2 + T(1) * 9^(log_3 n) = (n^2 - n)/2 + T(1) * n^(log_3 9) = (n^2 - n)/2 + T(1) * n^2
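The recursion-tree cost can be confirmed by evaluating the recurrence directly for powers of 3. A short Python check (T(1) is left as a parameter, matching the T(1) term in the formula):

```python
def T(n, leaf):
    """T(1) = leaf, T(n) = 9*T(n/3) + n, for n a power of 3."""
    if n == 1:
        return leaf
    return 9 * T(n // 3, leaf) + n

# Matches the recursion-tree cost (n^2 - n)/2 + T(1)*n^2 exactly.
for leaf in (1, 6):
    for k in range(8):
        n = 3 ** k
        assert T(n, leaf) == (n * n - n) // 2 + leaf * n * n
```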

Example - Case 1

T(n) = a * T(n/b) + f(n)

f(n) < n^(log_b a). Intuitively: the cost is dominated by the leaves, T(n/b). If f(n) = O(n^(log_b a - ε)) for some constant ε > 0, then T(n) ∈ Θ(n^(log_b a)).

T(n) = 8T(n/2) + Θ(n^2)

a = 8       b = 2       f(n) = Θ(n^2) ⇒ f(n) = O(n^2)

n^(log_2 8) = n^3                n^2 < n^(log_2 8) = n^3

If f(n) = O(n^(log_2 8 - ε)) for some constant ε > 0, then T(n) ∈ Θ(n^(log_b a))

f(n) = O(n^(log_2 8 - ε)) = O(n^(3 - 1)) = O(n^2) with ε = 1, so we can apply Case 1 and conclude:

T(n) = Θ(n^(log_2 8)) = Θ(n^(lg 8)) = Θ(n^3)

Recursion-tree cost:   n^3 - n^2 + T(1) * 8^(log_2 n) = n^3 - n^2 + T(1) * n^(log_2 8) = n^3 - n^2 + T(1) * n^3

Question 4.14 - Use Case 1 to determine:

T(n) = 16T(n/4) + n ⇒ T(n) = Θ(n^2)

Example - Case 2

T(n) = a * T(n/b) + f(n)


f(n) = n^(log_b a): If f(n) = Θ(n^(log_b a)), then T(n) ∈ Θ(n^(log_b a) * lg n).

T(n) = 27T(n/3) + n^3

a = 27       b = 3         f(n) = n^3

n^(log_3 27) = n^3, and f(n) = n^3 = Θ(n^3)

If f(n) = Θ(n^(log_b a)), then T(n) ∈ Θ(n^(log_b a) * lg n)

f(n) = n^3 = Θ(n^(log_3 27)) = Θ(n^3)

T(n) = Θ(n^(log_3 27) lg n) = Θ(n^3 lg n)

Confirmation by Recursion tree

At level i, for i = 0, 1, 2, ..., log_3 n - 1, there are 27^i subproblems of size n/3^i, so the cost of level i is (n/3^i)^3 * 27^i = n^3 * (27/3^3)^i = n^3. The leaf level contributes T(1) * 27^(log_3 n) = T(1) * n^(log_3 27) = T(1) * n^3.

Summing the log_3 n levels of cost n^3 each, plus the leaves:

T(n) = n^3 log_3 n + n^3 ∈ Θ(n^3 * lg n)

Note: lg n / log_3 n is a constant. Logarithms of different bases differ only by a constant factor, so the base can be ignored when constant factors do not matter, as in O-notation. See p. 57 of the text.
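The tree sum can also be confirmed numerically; a brief Python sketch (T(1) = 1 is an assumed base-case constant):

```python
def T(n):
    """T(1) = 1, T(n) = 27*T(n/3) + n^3, for n a power of 3."""
    if n == 1:
        return 1
    return 27 * T(n // 3) + n ** 3

# Each of the log_3 n levels costs n^3, plus T(1)*n^3 at the leaves.
for e in range(7):
    n = 3 ** e
    assert T(n) == n ** 3 * e + n ** 3   # n^3 * log_3 n + T(1) * n^3
```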

Example - Case 2

T(n) = a * T(n/b) + f(n)


f(n) = n^(log_b a): If f(n) = Θ(n^(log_b a)), then T(n) ∈ Θ(n^(log_b a) * lg n).

T(n) = T(2n/3) + 1 = T(n/(3/2)) + 1                1 = n^0 = n^(log_(3/2) 1)

a = 1       b = 3/2         f(n) = 1

n^(log_(3/2) 1) = n^0 = 1                Recall: log_b 1 = 0

If f(n) = Θ(n^(log_b a)), then T(n) ∈ Θ(n^(log_b a) * lg n)

f(n) = 1 = Θ(n^(log_(3/2) 1)) = Θ(n^0) = Θ(1)

T(n) = Θ(n^(log_(3/2) 1) lg n) = Θ(lg n)
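Since a = 1 and the size shrinks by a constant factor, T(n) simply counts how many times n can be multiplied by 2/3 before reaching 1, which tracks log_(3/2) n. A quick Python check (floors and T(1) = 1 are assumed):

```python
import math

def T(n):
    """T(1) = 1, T(n) = T(floor(2n/3)) + 1."""
    cost = 1
    while n > 1:
        n = (2 * n) // 3
        cost += 1
    return cost

# T(n) stays within a small additive constant of log_(3/2) n.
for n in (10, 100, 10_000, 1_000_000):
    assert abs(T(n) - math.log(n, 1.5)) <= 3
```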

Question 4.15 - Use Case 2 to determine:

T(n) = 4T(n/2) + n^2 ⇒ T(n) = Θ(n^2 lg n)

Example - Case 2 (More general than text Theorem 4.1)

T(n) = a * T(n/b) + f(n)

f(n) = n^(log_b a): If f(n) = Θ(n^(log_b a)), then T(n) ∈ Θ(n^(log_b a) * lg n).

T(n) = 27T(n/3) + Θ(n^3 lg n)

a = 27         b = 3       f(n) = Θ(n^3 lg n)

n^(log_3 27) = n^3

If f(n) = Θ(n^(log_b a) (lg n)^k), then T(n) ∈ Θ(n^(log_b a) (lg n)^(k+1))

f(n) = Θ(n^(log_3 27) (lg n)^1) = Θ(n^3 (lg n)^1)

T(n) = Θ(n^3 (lg n)^2)

f(n) = Θ(n^(log_b a) (lg n)^k), where k ≥ 0.

This formulation of Case 2 is more general than the one in Theorem 4.1; it is given in Exercise 4.4-2.

Here f(n) is within a polylog factor of n^(log_b a), but not smaller:

f(n) = Θ(n^3 (lg n)^1) is within a factor of (lg n)^1 of n^3

Solution: T(n) = Θ(n^(log_b a) (lg n)^(k+1))

T(n) = Θ(n^(log_3 27) (lg n)^(1+1)) = Θ(n^(log_3 27) (lg n)^2) = Θ(n^3 (lg n)^2)

Intuitively: the cost is n^(log_b a) (lg n)^k at each level, and there are Θ(lg n) levels.

Simple case: k = 0 → f(n) = Θ(n^(log_b a)) → T(n) = Θ(n^(log_b a) lg n).

Polylog: a polylogarithmic function of n is a polynomial in the logarithm of n, e.g. (log n)^k

http://en.wikipedia.org/wiki/Polylogarithmic
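The generalized Case 2 bound can be checked with exact arithmetic by writing the polylog factor in base 3, since bases differ only by a constant factor. A Python sketch (T(1) = 0 is assumed so that only the level costs appear):

```python
def T(n):
    """T(1) = 0, T(n) = 27*T(n/3) + n^3 * log_3(n), for n a power of 3."""
    if n == 1:
        return 0
    k, m = 0, n
    while m > 1:               # k = log_3 n
        m //= 3
        k += 1
    return 27 * T(n // 3) + n ** 3 * k

# Summing levels gives n^3 * k*(k+1)/2, which is Theta(n^3 (lg n)^2).
for e in range(8):
    n = 3 ** e
    assert T(n) == n ** 3 * e * (e + 1) // 2
```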

Example - Case 3

T(n) = a * T(n/b) + f(n)


f(n) > n^(log_b a). Intuitively: the cost is dominated by the root, f(n). If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and a f(n/b) ≤ c f(n) for some constant c < 1 and all sufficiently large n, then T(n) ∈ Θ(f(n)).

T(n) = 4T(n/2) + n^3

a = 4       b = 2       f(n) = n^3

If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0:

f(n) = n^3 = Ω(n^(log_2 4 + ε)) = Ω(n^(2 + ε)) = Ω(n^3) for ε = 1

and

if a f(n/b) ≤ c f(n) for some constant c < 1 and all sufficiently large n:

a f(n/b) = 4(n/2)^3 = n^3/2 ≤ c n^3 = c f(n) for c = 1/2

then T(n) ∈ Θ(f(n)) = Θ(n^3)
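The regularity condition can be verified mechanically over a range of n; a small Python sketch (the helper name `regularity_holds` is ours, not from the text):

```python
def regularity_holds(a, b, f, c, n_max=1000):
    """Check the Case 3 side condition a*f(n/b) <= c*f(n) for n = 2..n_max."""
    return all(a * f(n / b) <= c * f(n) for n in range(2, n_max + 1))

# For T(n) = 4T(n/2) + n^3: a*f(n/b) = 4*(n/2)^3 = n^3/2, so c = 1/2 works.
assert regularity_holds(4, 2, lambda n: n ** 3, 0.5)
# A smaller c, e.g. c = 0.4, fails: n^3/2 > 0.4*n^3 for every n >= 2.
assert not regularity_holds(4, 2, lambda n: n ** 3, 0.4)
```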

Example - Case 3

T(n) = a * T(n/b) + f(n)


f(n) > n^(log_b a). Intuitively: the cost is dominated by the root, f(n). If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and a f(n/b) ≤ c f(n) for some constant c < 1 and all sufficiently large n, then T(n) ∈ Θ(f(n)).

T(n) = 3T(n/4) + n lg n                n lg n > n^(log_4 3) ≈ n^0.793

a = 3       b = 4       f(n) = n lg n

n^(log_4 3) ≈ n^0.793

If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0:

f(n) = n lg n = Ω(n^(log_4 3 + ε)) = Ω(n^(0.793 + ε)) = Ω(n^1) for ε ≈ 0.2

and

if a f(n/b) ≤ c f(n) for some constant c < 1 and all sufficiently large n:

f(n) = n lg n

a f(n/b) = 3(n/4) lg(n/4) = (3/4) n [lg(n/4)] = (3/4) n [lg n - lg 4] = (3/4) n [lg n - lg 2^2] = (3/4) n [lg n - 2] = (3/4) n lg n - (3/2) n ≤ (3/4) n lg n = c n lg n = c f(n) for c = 3/4

then T(n) ∈ Θ(f(n)) = Θ(n lg n)
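The same mechanical check works here; a Python sketch (starting at n = 4 so that lg(n/4) is nonnegative; the helper name is ours):

```python
import math

def regularity_holds(a, b, f, c, n_lo=4, n_hi=10_000):
    """Check the Case 3 side condition a*f(n/b) <= c*f(n) over [n_lo, n_hi]."""
    return all(a * f(n / b) <= c * f(n) for n in range(n_lo, n_hi + 1))

# 3*(n/4)*lg(n/4) = (3/4)*n*(lg n - 2) <= (3/4)*n*lg n, so c = 3/4 works.
assert regularity_holds(3, 4, lambda n: n * math.log2(n), 0.75)
```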

Question 4.16 - Use Case 3 to determine:

T(n) = 3T(n/2) + n^2 ⇒ T(n) = Θ(n^2)

Hint: it is not necessary to find an exact ε, only that f(n) = n^2 = Ω(n^(log_b a + ε)) = Ω(n^(log_2 3 + ε)) for some constant ε > 0 (any ε ≤ 2 - log_2 3 ≈ 0.415 works); see the graph.

Recall: if c g(n) ≤ f(n) for some constant c > 0 and all n ≥ n₀, then f(n) ∈ Ω(g(n)).

Example - Master method does not apply

T(n) = 2T(n/2) + n lg n

a = 2       b = 2      f(n) = n lg n

n^(log_2 2) = n

Case 1: does not apply, because

f(n) = n lg n grows at least as fast as n^(log_2 2) = n,

so there is no constant ε > 0 to subtract.

f(n) < n^(log_b a): If f(n) = O(n^(log_b a - ε)) for some constant ε > 0, then T(n) ∈ Θ(n^(log_b a)).

Recall the limit characterization of O: lim f(n)/g(n) = c for some 0 ≤ c < ∞,

but lim (n lg n)/n = lim lg n = ∞.


Case 2: does not apply, because

f(n) = n lg n ≠ Θ(n^(log_2 2)) = Θ(n),

and f(n) = n lg n is not O(n).

f(n) = n^(log_b a): If f(n) = Θ(n^(log_b a)), then T(n) ∈ Θ(n^(log_b a) * lg n).

Recall the limit characterization of Θ: lim f(n)/g(n) = c for some 0 < c < ∞,

but lim (n lg n)/n = ∞.


Case 3: If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, ...

f(n) > n^(log_b a): If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and a f(n/b) ≤ c f(n) for some constant c < 1 and all sufficiently large n, then T(n) ∈ Θ(f(n)).

Fails because, while

f(n) = n lg n = Ω(n^(log_2 2)) = Ω(n^1) = Ω(n),

f(n) = n lg n ≠ Ω(n^(1 + ε)) for any choice of ε > 0, so we cannot add ε.

To see this, recall the limit characterization of Ω: lim f(n)/g(n) = c for some 0 < c ≤ ∞.

f(n)/n^(log_2 2) = n lg n / n^1 = lg n,

so lim (n lg n)/n^1 = lim lg n = ∞,

but f(n) must be polynomially larger than n^(log_b a), i.e. the limit of n lg n / n^(1 + ε) must be positive for some ε > 0. (For example, n^2 is polynomially larger than n^1.)

However, lg n / n^ε → 0 for any constant ε > 0:

n lg n / n^(1 + ε) = n lg n / (n^1 n^ε) = lg n / n^ε → 0                for any constant ε > 0

The graph shows that lg n grows more slowly than n^0.25.
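That lg n / n^ε → 0 can also be seen numerically; a small Python sketch using ε = 0.25, the exponent from the graph:

```python
import math

def ratio(n, eps=0.25):
    return math.log2(n) / n ** eps

# The ratio peaks near n = e^(1/eps) and then decreases toward 0,
# so n lg n is not Omega(n^(1+eps)) for this (or any) eps > 0.
vals = [ratio(10 ** k) for k in range(1, 9)]
assert all(x > y for x, y in zip(vals[3:], vals[4:]))   # eventually decreasing
assert vals[-1] < 0.5                                   # already small at n = 10^8
```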

Method for Chip & Conquer

The problem of size n is chipped down into one subproblem of size n-c.

T(n) = T(n - c) + f(n), where:

• c > 0 (the chipping factor)

• f(n) - the nonrecursive cost (to create the subproblem and/or combine with the solutions of other subproblems)

Then T(n) can be asymptotically bounded as follows:

• If f(n) is a polynomial n^α, then T(n) ∈ Θ(n^(α+1))

• If f(n) is lg n, then T(n) ∈ Θ(n lg n)

Example - Factorial

    int factorial( int n ) {
        if ( n == 1 ) return 1;
        return n * factorial( n-1 );
    }

Recurrence equation:

    T(1) = e
    T(n) = T(n-1) + d     for n > 1

f(n) = d, a degree-0 polynomial (d * n^0)

T(n) ∈ Θ(n^(0+1)) = Θ(n)

Question 4.17 - Use Chip-and-Conquer to determine:

T(n) = T(n-4) + n^2 ⇒ T(n) = Θ(n^3)
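The Θ(n^3) answer can be checked by summing the chips directly: roughly n/4 terms of size up to n^2, about n^3/12 in total. A Python sketch (T(n) = 1 for n ≤ 0 is an assumed base case, not from the text):

```python
def T(n):
    """T(n) = T(n - 4) + n^2, with T(n) = 1 for n <= 0 (assumed base case)."""
    total = 1
    while n > 0:
        total += n * n
        n -= 4
    return total

# About n/4 squares, the largest near n^2: the sum behaves like n^3/12.
for n in (100, 1000, 10_000):
    assert 0.8 * n ** 3 / 12 <= T(n) <= 1.2 * n ** 3 / 12
```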

Method for Chip & Be Conquered

The problem of size n is chipped down into b subproblems of size n - c.

T(n) = b * T(n - c) + f(n), where:

• b > 1 (the branching factor)

• c > 0 (the chipping factor)

• f(n) - the nonrecursive cost (to create subproblems and/or combine with the solutions of other subproblems)

Then in most cases T(n) can be bounded as T(n) ∈ Θ(b^(n/c)).

This grows exponentially in n, so in general such algorithms cannot solve problems of significant size.

Example

    T(1) = e
    T(n) = 5T(n-1) + d    for n > 1

b = 5

c = 1

f(n) = d

T(n) ∈ Θ(b^(n/c)) = Θ(5^(n/1)) = Θ(5^n)
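Unrolling confirms the exponential bound; a Python sketch with e = d = 1 (these constant values are our choice, for illustration only):

```python
def T(n):
    """T(1) = 1, T(n) = 5*T(n - 1) + 1   (b = 5, c = 1, f(n) = d = 1)."""
    t = 1
    for _ in range(n - 1):
        t = 5 * t + 1
    return t

# Unrolling gives T(n) = (5^n - 1)/4, i.e. Theta(5^n) = Theta(b^(n/c)).
for n in range(1, 20):
    assert T(n) == (5 ** n - 1) // 4
```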

Question 4.18 - Use Chip-and-be-Conquered to determine bounds for:

T(n) = 7T(n-4) + n2