Recursion trees and master method for recurrence relations


A **recursion tree** is useful for visualizing what happens when a
recurrence is iterated. It diagrams the tree of recursive calls, and
the amount of work done at each call.

For instance consider the recurrence

*T(n) = 2T(n/2) + n^{2}*.

The recursion tree for this recurrence is of the following form:

                     n^2
                   /     \
             (n/2)^2     (n/2)^2               height =
             /    \       /    \                 lg n
       (n/4)^2 (n/4)^2 (n/4)^2 (n/4)^2
         / \     / \     / \     / \
        .   .   .   .   .   .   .   .

Generally it is straightforward to sum across each row of the tree, to obtain the total work done at a given level:

                     n^2                                  n^2
                   /     \
             (n/2)^2     (n/2)^2                      (1/2)n^2       height =
             /    \       /    \                                       lg n
       (n/4)^2 (n/4)^2 (n/4)^2 (n/4)^2                (1/4)n^2
         / \     / \     / \     / \
        .   .   .   .   .   .   .   .

This is a geometric series, and thus in the limit the sum
is *O(n^{2})*. In other words, the depth of the tree in
this case does not really matter: the amount of work at each level is
decreasing so quickly that the total is only a constant factor more
than the work at the root.

Recursion trees can be useful for gaining intuition into the closed form of a recurrence, but they are not a proof (in fact, it is easy to get the wrong answer with a recursion tree, as with any method that involves ''...'' kinds of reasoning). As we saw last time, a good way of establishing a closed form for a recurrence is to guess an answer and then prove by induction that the answer is correct. Recursion trees can be a good method of guessing an answer.

Let's consider another example,

*T(n) = T(n/3) + T(2n/3) + n*.

Expanding out the first few levels, the recurrence tree is:

                     n                                    n
                  /     \
              (n/3)     (2n/3)                            n          height =
              /   \      /   \                                    log_{3/2} n
          (n/9) (2n/9) (2n/9) (4n/9)                      n
           / \    / \    / \    / \
          .   .  .   .  .   .  .   .

Note that the tree here is not balanced: the longest path keeps
reducing *n* by a factor of *2/3* and thus has
length *log_{3/2} n*. Each full level of the tree does
exactly *n* work, and no level does more, so our guess as to the
closed form of this recurrence is *O(n lg n)*.
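One way to sanity-check this guess (a numeric sketch of our own, not part of the notes; the function names are made up) is to evaluate the recurrence *T(n) = T(n/3) + T(2n/3) + n* directly and compare it against *n lg n*:

```ocaml
(* Directly evaluate T(n) = T(n/3) + T(2n/3) + n, taking T(n) = 1 for n <= 1. *)
let rec t n =
  if n <= 1.0 then 1.0
  else t (n /. 3.0) +. t (2.0 *. n /. 3.0) +. n

(* Ratio of T(n) to n lg n; if the guess is right, this stays bounded. *)
let ratio n = t n /. (n *. log n /. log 2.0)
```

As *n* grows, the ratio stays between small constants (the true total lies between *n log_3 n* and *n log_{3/2} n*), consistent with the *O(n lg n)* guess.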

The “master method” is a cookbook method for solving recurrences, and is very handy for many of the recurrences seen in practice. Suppose you have a recurrence of the form

*T(n) = aT(n/b) + f(n)*.

In other words, the problem is divided into *a* subproblems, each of
size *n/b*, where the work to split the problem into subproblems and
recombine the results is *f(n)*.

We can visualize this as a recurrence tree, where the nodes in the tree have
a branching factor of *a*. The top node has work
*f(n)* associated with it, the next level has work *f(n/b)*
associated with each node, the next level has
work *f(n/b ^{2})* associated with each node, and so
on. The tree has *log_b n* levels, and
thus *a*^{log_b n} = *n*^{log_b a} leaves.

The time taken is just the sum of the terms *f(n/b^{i})*
at all the nodes. What this sum looks like depends on how the
asymptotic growth of *f(n)* compares to the asymptotic growth of the
number of leaves, *n*^{log_b a}. There are three cases:

- Case 1: *f(n)* is O(*n*^{log_b a − ε}) for some constant *ε* > 0. The number of leaves grows faster than *f*, so asymptotically all of the work is done at the leaves, and *T(n)* is Θ(*n*^{log_b a}).
- Case 2: *f(n)* is Θ(*n*^{log_b a}). The work per level matches the growth of the number of leaves, so the same order of work is done at every level of the tree. The tree has O(lg *n*) levels, times the work done on one level, yielding *T(n)* is Θ(*n*^{log_b a} lg *n*).
- Case 3: *f(n)* is Ω(*n*^{log_b a + ε}) for some constant *ε* > 0. In this case we also need to show that *af(n/b)* ≤ *kf(n)* for some constant *k* < 1 and all sufficiently large *n*, which means that *f* grows faster than the number of leaves. Asymptotically all of the work is done at the root node, so *T(n)* is Θ(*f(n)*).
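As an illustration only (our own sketch, not part of the notes), the three cases can be mechanized for recurrences of the restricted form *T(n) = aT(n/b) + n^{k}*; for polynomial *f* the Case 3 regularity condition holds automatically, so comparing *k* to the critical exponent *log_b a* is enough:

```ocaml
(* Classify T(n) = a T(n/b) + n^k by the master method.
   The critical exponent is log_b a; compare the exponent k to it. *)
let master a b k =
  let crit = log (float_of_int a) /. log (float_of_int b) in
  let kf = float_of_int k in
  if kf < crit -. 1e-9 then
    Printf.sprintf "Theta(n^%.2f)" crit    (* Case 1: leaves dominate *)
  else if kf > crit +. 1e-9 then
    Printf.sprintf "Theta(n^%d)" k         (* Case 3: root dominates  *)
  else
    Printf.sprintf "Theta(n^%d lg n)" k    (* Case 2: balanced        *)
```

For instance, `master 4 2 1`, `master 4 2 2`, and `master 4 2 3` land in Cases 1, 2, and 3 respectively, matching the worked examples that follow.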

Note that the master method does not always apply. In fact the second example considered above, where the subproblem sizes are unequal, is not covered by the master method.

Let's look at a few examples where the master method does apply.

__Example 1__ Consider the recurrence

*T(n) = 4T(n/2) + n*.

For this recurrence, there are *a=4* subproblems, each
dividing the input by *b=2*, and the work done on each call
is *f(n)=n*.
Thus *n*^{log_b a}
is *n*^{2}, and *f(n) = n* is O(*n*^{2 − ε}) (take *ε* = 1).
Case 1 applies, so *T(n)* is Θ(*n*^{2}).

__Example 2__ Consider the recurrence

*T(n) = 4T(n/2) + n^{2}*.

For this recurrence, there are again *a=4* subproblems, each
dividing the input by *b=2*, but now the work done on each call
is *f(n)=n ^{2}*.
Again *n*^{log_b a} is *n*^{2}, and now *f(n) = n*^{2}
is Θ(*n*^{2}). Case 2 applies, so *T(n)* is Θ(*n*^{2} lg *n*).

__Example 3__ Consider the recurrence

*T(n) = 4T(n/2) + n^{3}*.

For this recurrence, there are again *a=4* subproblems, each
dividing the input by *b=2*, but now the work done on each call
is *f(n)=n ^{3}*.
Again *n*^{log_b a} is *n*^{2}, and now *f(n) = n*^{3}
is Ω(*n*^{2 + ε}) (take *ε* = 1). The regularity condition also holds,
since *4f(n/2) = n*^{3}/2 = (1/2)*f(n)*. Case 3 applies,
so *T(n)* is Θ(*n*^{3}).
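The three different answers can also be seen numerically (a rough check of our own, not from the notes): evaluate *T(n) = 4T(n/2) + n^{p}* directly and look at how much *T* grows when *n* doubles. A doubling ratio near 4 indicates Θ(*n*^{2}) growth (up to a log factor), while a ratio near 8 indicates Θ(*n*^{3}).

```ocaml
(* Evaluate T(n) = 4 T(n/2) + n^p with T(1) = 1, for n a power of two. *)
let rec t p n =
  if n <= 1.0 then 1.0
  else 4.0 *. t p (n /. 2.0) +. (n ** p)

(* How much T grows when n doubles. *)
let doubling_ratio p n = t p (2.0 *. n) /. t p n
```

At *n = 1024*: with *p = 1* the ratio is very close to 4 (Case 1); with *p = 3* it is very close to 8 (Case 3); with *p = 2* it sits slightly above 4, reflecting the extra lg *n* factor of Case 2.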

The following function sorts the first two-thirds of a list, then the last two-thirds, then the first two-thirds again:

    (* take l m is the first m elements of l; drop l m is the rest.
       These helpers are used throughout the notes and assumed defined. *)
    let rec take l m = if m = 0 then [] else List.hd l :: take (List.tl l) (m - 1)
    let rec drop l m = if m = 0 then l else drop (List.tl l) (m - 1)

    let rec sort3 a =
      match a with
        [] -> []
      | [x] -> [x]
      | [x; y] -> [min x y; max x y]
      | _ ->
          let n = List.length a in
          let m = (2 * n + 2) / 3 in            (* m = ceiling of 2n/3 *)
          (* sort the first two-thirds *)
          let res1 = sort3 (take a m) @ drop a m in
          (* sort the last two-thirds *)
          let res2 = take res1 (n - m) @ sort3 (drop res1 (n - m)) in
          (* sort the first two-thirds again *)
          sort3 (take res2 m) @ drop res2 m

Perhaps surprisingly, this algorithm does sort the list. We leave the proof that it sorts correctly as an exercise for the reader. The key is to observe that the first two passes ensure that the last third of the final list contains the correct elements in the correct order.

We can derive the running time of the algorithm from its recurrence.
The routine does some O(*n*)
work and then makes three recursive calls on lists of length 2*n*/3.
Therefore its recurrence is:

*T(n) = cn + 3T(2n/3)*.

If we apply the master method to the `sort3` algorithm, we
see easily that we are in Case 1, so the algorithm
is O(*n*^{log_{3/2} 3}) =
O(*n*^{2.71}), making it even slower than insertion sort!
Note that the fact that Case 1 applies means that
improving *f(n)* will not improve the overall time. For instance,
replacing lists with arrays would improve *f(n)* from linear to
constant time, but the asymptotic complexity would still be *O(n^{2.71})*.