CS 3110 Lecture 20
Recursion trees and master method for recurrence relations

Recursion trees

A recursion tree is useful for visualizing what happens when a recurrence is iterated. It diagrams the tree of recursive calls, and the amount of work done at each call.

For instance consider the recurrence

T(n) = 2T(n/2) + n^2.

The recursion tree for this recurrence is of the following form:

          |                   n^2
          |               /         \
          |          (n/2)^2         (n/2)^2
 height=  |         /     \          /     \
  lg n    |     (n/4)^2 (n/4)^2  (n/4)^2 (n/4)^2
          |      /  \    /  \     /  \    /  \
          |                    .
          |                    .
          |                    .

It is generally straightforward to sum across each row of the tree to obtain the total work done at a given level:

          |                   n^2                              n^2
          |               /         \
          |          (n/2)^2         (n/2)^2                (1/2)n^2
 height=  |         /     \          /     \
  lg n    |     (n/4)^2 (n/4)^2  (n/4)^2 (n/4)^2            (1/4)n^2
          |      /  \    /  \     /  \    /  \
          |                    .
          |                    .
          |                    .

The level sums form a geometric series, n^2 (1 + 1/2 + 1/4 + ...) ≤ 2n^2, so the total is O(n^2). In other words, the depth of the tree does not really matter here: the amount of work at each level decreases so quickly that the total is only a constant factor more than the work at the root.
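
One way to sanity-check this guess is to iterate the recurrence numerically and watch the ratio T(n)/n^2 settle toward a constant (about 2 here). A minimal sketch in OCaml; the helper name t and the base case T(1) = 1 are our own choices, and the base case does not affect the growth rate:

let rec t n =
  if n <= 1 then 1.0                             (* arbitrary base case T(1) = 1 *)
  else 2.0 *. t (n / 2) +. float_of_int (n * n)  (* T(n) = 2T(n/2) + n^2 *)

let () =
  (* the printed ratio approaches a constant, consistent with Θ(n^2) *)
  List.iter
    (fun n ->
       Printf.printf "n = %6d   T(n)/n^2 = %.3f\n"
         n (t n /. float_of_int (n * n)))
    [1024; 4096; 16384; 65536]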

Recursion trees can be useful for gaining intuition into the closed form of a recurrence, but they are not a proof (in fact it is easy to get the wrong answer with a recursion tree, as with any method that involves "..." kinds of reasoning). As we saw last time, a good way of establishing a closed form for a recurrence is to guess an answer and then prove by induction that the guess is correct. Recursion trees can be a good method of guessing an answer.

Let's consider another example,

T(n) = T(n/3) + T(2n/3) + n.

Expanding out the first few levels, the recursion tree is:

            |                   n                               n
            |               /       \
            |           (n/3)        (2n/3)                     n
  height=   |          /    \        /     \
 log_(3/2) n|      (n/9) (2n/9)   (2n/9) (4n/9)                 n
            |      /  \   /  \     /  \   /  \
            |                   .
            |                   .
            |                   .

Note that the tree here is not balanced: the longest path keeps reducing n by a factor of 2/3 and thus has length log_(3/2) n. Since each level sums to n (at least until the shorter branches start to bottom out) and there are O(lg n) levels, our guess for the closed form of this recurrence is O(n lg n).
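
To turn the guess into a proof, we can use the guess-and-prove-by-induction approach mentioned above. A sketch of the inductive step (ignoring floors and base cases), assuming T(m) ≤ c m lg m for all m < n:

\begin{aligned}
T(n) &\le c\,\tfrac{n}{3}\lg\tfrac{n}{3} \;+\; c\,\tfrac{2n}{3}\lg\tfrac{2n}{3} \;+\; n \\
     &= c\,n\lg n \;-\; c\,n\bigl(\lg 3 - \tfrac{2}{3}\bigr) \;+\; n \\
     &\le c\,n\lg n \qquad \text{whenever } c \ge 1/(\lg 3 - \tfrac{2}{3}) \approx 1.09,
\end{aligned}

which confirms that T(n) is O(n lg n).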

The master method

The “master method” is a cookbook method for solving recurrences that is very handy for dealing with many recurrences seen in practice. Suppose you have a recurrence of the form

T(n) = aT(n/b) + f(n).

In other words, the problem is divided into a subproblems, each of size n/b, and f(n) is the work done to split the problem into subproblems and recombine the results.

We can visualize this as a recursion tree, where the nodes in the tree have a branching factor of a. The top node has work f(n) associated with it, the next level has work f(n/b) associated with each node, the next level has work f(n/b^2) associated with each node, and so on. The tree has log_b n levels, so the total number of leaves in the tree is a^(log_b n), which, as a function of n, is n^(log_b a).

The time taken is just the sum of the terms f(n/b^i) over all the nodes. What this sum looks like depends on how the asymptotic growth of f(n) compares to the asymptotic growth of the number of leaves. There are three cases:

Case 1: f(n) is O(n^(log_b a − ε)) for some ε > 0. Then the work per level grows geometrically toward the leaves, the leaves dominate, and T(n) is Θ(n^(log_b a)).

Case 2: f(n) is Θ(n^(log_b a)). Then each of the log_b n levels contributes roughly the same amount of work, and T(n) is Θ(n^(log_b a) lg n).

Case 3: f(n) is Ω(n^(log_b a + ε)) for some ε > 0, and in addition a·f(n/b) ≤ k·f(n) for some constant k < 1 and all sufficiently large n. Then the root dominates, and T(n) is Θ(f(n)).
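
In the common special case f(n) = Θ(n^d), the comparison boils down to comparing d with log_b a. Here is a rough sketch of that special case in OCaml; the function master is our own illustration, not a standard library function, and for polynomial f the Case 3 regularity condition holds automatically:

(* Simplified master method for T(n) = a T(n/b) + Θ(n^d), with a ≥ 1, b > 1.
   Compares d against log_b a; Case 3's regularity condition
   a·f(n/b) ≤ k·f(n), k < 1, is automatic when f is a polynomial. *)
let master a b d =
  let e = log (float_of_int a) /. log b in        (* e = log_b a *)
  if d < e -. 1e-9 then Printf.sprintf "Θ(n^%.2f)" e         (* Case 1: leaves dominate *)
  else if d > e +. 1e-9 then Printf.sprintf "Θ(n^%.2f)" d    (* Case 3: root dominates *)
  else Printf.sprintf "Θ(n^%.2f lg n)" d                     (* Case 2: all levels equal *)

For instance, master 4 2.0 1.0, master 4 2.0 2.0, and master 4 2.0 3.0 reproduce the answers for the three examples below.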

Note that the master method does not always apply. In fact the second example considered above, where the subproblem sizes are unequal, is not covered by the master method.

Let's look at a few examples where the master method does apply.

Example 1 Consider the recurrence

T(n) = 4T(n/2) + n.

For this recurrence, there are a = 4 subproblems, each dividing the input by b = 2, and the work done on each call is f(n) = n. Thus n^(log_b a) is n^2, and f(n) is O(n^(2−ε)) for ε = 1, so Case 1 applies. Thus T(n) is Θ(n^2).

Example 2 Consider the recurrence

T(n) = 4T(n/2) + n^2.

For this recurrence, there are again a = 4 subproblems, each dividing the input by b = 2, but now the work done on each call is f(n) = n^2. Again n^(log_b a) is n^2, and f(n) is thus Θ(n^2), so Case 2 applies. Thus T(n) is Θ(n^2 lg n). Note that increasing the work on each recursive call from linear to quadratic has increased the overall asymptotic running time by only a logarithmic factor.

Example 3 Consider the recurrence

T(n) = 4T(n/2) + n^3.

For this recurrence, there are again a = 4 subproblems, each dividing the input by b = 2, but now the work done on each call is f(n) = n^3. Again n^(log_b a) is n^2, and f(n) is thus Ω(n^(2+ε)) for ε = 1. Moreover, 4(n/2)^3 = n^3/2, so the regularity condition a·f(n/b) ≤ k·f(n) holds for k = 1/2, and Case 3 applies. Thus T(n) is Θ(n^3).

Example: Yet another sorting algorithm

The following function sorts the first two-thirds of a list, then the last two-thirds, then the first two-thirds again. It uses two small helpers: take l m returns the first m elements of l, and drop l m returns the rest.

(* take l m: the first m elements of l; drop l m: everything after them *)
let rec take l m =
  if m = 0 then [] else
    match l with [] -> [] | x :: xs -> x :: take xs (m - 1)

let rec drop l m =
  if m = 0 then l else
    match l with [] -> [] | _ :: xs -> drop xs (m - 1)

let rec sort3 a =
  match a with
      [] -> []
    | [x] -> [x]
    | [x; y] -> [min x y; max x y]
    | _ ->
        let n = List.length a in
        let m = (2 * n + 2) / 3 in                    (* m = ceil(2n/3) *)
        let res1 = sort3 (take a m) @ drop a m in     (* sort first two-thirds *)
        let res2 = take res1 (n - m) @ sort3 (drop res1 (n - m)) in  (* sort last two-thirds *)
        sort3 (take res2 m) @ drop res2 m             (* sort first two-thirds again *)
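
For example, sort3 [3; 1; 4; 1; 5; 9; 2; 6] evaluates to [1; 1; 2; 3; 4; 5; 6; 9].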

Perhaps surprisingly, this algorithm does sort the list. We leave the proof that it sorts correctly as an exercise to the reader. The key is to observe that the first two passes ensure that the last third of the list contains the largest elements, in sorted order, so the third pass finishes the job.

We can derive the running time of the algorithm from its recurrence. The routine does O(n) work (take, drop, @, and List.length are all linear) and then makes three recursive calls on lists of length 2n/3. Therefore its recurrence is:

T(n) = 3T(2n/3) + cn

If we apply the master method to the sort3 algorithm, we have a = 3, b = 3/2, and f(n) = cn, so n^(log_b a) = n^(log_(3/2) 3) ≈ n^2.71. Since f(n) is O(n^(2.71−ε)), we are in Case 1, and the algorithm is O(n^(log_(3/2) 3)) = O(n^2.71), making it even slower than insertion sort! Note that the fact that Case 1 applies means that improving f(n) will not improve the overall running time. For instance, replacing lists with arrays improves f(n) from linear to constant time, but the asymptotic complexity is still O(n^2.71).
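
As a cross-check, the illustrative master function sketched earlier agrees: master 3 1.5 1.0 reports Θ(n^2.71).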