# Efficiency
Today we'll review the definition of Big-Oh, do some proofs with it,
and learn about *recurrence relations*.
## Big-Oh
Recall the definition of Big-Oh:
$$O(g) = \\{ f | \exists c > 0, n_0 > 0 . \forall n \geq n_0 . f(n) \leq cg(n) \\}$$
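Unfolding the definition, a proof that $f \in O(g)$ always has the same shape: exhibit witness constants $c$ and $n_0$, then discharge the remaining universally quantified obligation
$$\forall n \geq n_0 .\ f(n) \leq cg(n).$$
The example proof below follows exactly this shape.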
##### Exercise: squared [✭]
Prove that $(\\mathit{fun}~n \rightarrow 2n^2)
\in O(\\mathit{fun}~n \rightarrow n^3)$.
□
Here is an example proof that $(\\mathit{fun}~n \rightarrow n)
\in O(\\mathit{fun}~n \rightarrow 2^n)$, or as some authors
might write, that "$n$ is $O(2^n)$":
**Proof:** Let $c = 1$ and let $n_0 = 1$. We must show that for all $n \geq n_0$,
it holds that $n \leq 2^n$. We do so by induction on $n$, which you will recall
from CS 2800:
* Base case, $n = 1$: Substituting $1$ for $n$ in $n \leq 2^n$,
we have that $1 \leq 2$, which is true.
* Inductive case, an arbitrary $n \geq 1$:
- Inductive hypothesis: Assume $n \leq 2^n$.
- Show: $n+1 \leq 2^{n+1}$.
- Proof: Adding $1$ to each side of the inductive hypothesis, we have
that $n + 1 \leq 2^n + 1$. Focusing on the right-hand side of that
inequality, we have that $2^n + 1 \leq 2^n + 2^n$, because $1 \leq
2^n$ when (as is the case here) $n \geq 0$. So we have that $2^n + 1
\leq 2^n + 2^n$. Once more focusing on the right-hand side, we can
rewrite $2^n + 2^n$ as $2\cdot2^n$, which is $2^{n+1}$. Therefore we
have $n+1 \leq 2^{n+1}$, which is what we needed to show.
□
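Before attempting such an induction, it can help to spot-check the inequality in OCaml (a sanity check, not a proof); here `1 lsl n` computes $2^n$:
```
(* Check that n <= 2^n holds for a few sample values of n. *)
let () =
  assert (List.for_all (fun n -> n <= 1 lsl n) [1; 2; 3; 10; 20; 30])
```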
##### Exercise: logged [✭✭]
Prove that $(\\mathit{fun}~n \rightarrow \log n)
\in O(\\mathit{fun}~n \rightarrow n)$.
*Hint: You will need to do a subproof
involving logs; of the many ways of doing that subproof, one way would
be to follow the example above.*
□
## Running time
Let's make the following assumption: there is some constant $S$ such
that every step in the small-step substitution model semantics takes
time less than or equal to $S$. We are effectively assuming that each
of the following is a constant-time operation:
- performing a primitive arithmetic operation (like `+`),
- forming a tuple, record, or variant from given values,
- branching on a value in an `if` or `match` expression, and
- applying a function (but not evaluating it all the way to a value).
The expression `4 * (3+5)` would then take time at most $2S$ to run,
because it takes two steps in the small-step semantics.
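Concretely, those two steps are:
```
4 * (3 + 5)
--> 4 * 8
--> 32
```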
##### Exercise: running time [✭✭]
Give an upper bound on the running time of each expression in terms of $S$.
- `let x = 3 in x + x`
- `let x = 3 + 3 in x + x`
- `let f x = x + x in f (f 3)`
□
Absent loops or recursion, functions will always run in constant time.
But consider the following recursive function:
```
let rec countdown n =
  if n = 0 then
    ()
  else
    countdown (n-1)
```
Let $T(n)$ be the time it takes to run `countdown` on input $n$.
We can directly compute a bound on $T(0)$:
```
countdown 0
--> if 0 = 0 then () else countdown (0-1)
--> if true then () else countdown (0-1)
--> ()
```
Since `countdown 0` takes three small steps, $T(0) \leq 3S$.
But what about $T(n)$ for $n > 0$? We can use the substitution model:
```
countdown n
--> if n = 0 then () else countdown (n-1)
--> if false then () else countdown (n-1)
--> countdown (n-1)
--> countdown n'
-->* v
```
where $n' = n - 1$. So it takes 4 steps to evaluate to `countdown n'`.
It then takes $T(n') = T(n-1)$ steps to evaluate `countdown n'` to a
value. Therefore $T(n) \leq 4S + T(n-1)$. This kind of formula (relating
the value of a function on different inputs) is called a **recurrence
relation**.
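Putting the two bounds together, the running time of `countdown` satisfies:
$$T(0) \leq 3S, \qquad T(n) \leq 4S + T(n-1) \text{ for } n > 0.$$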
In the worst case, where every step takes the full $S$ time, the bounds
become equalities: $T(0) = 3S$ and $T(n) = 4S + T(n-1)$. Unfolding the
recurrence $n$ times yields a direct formula, $T(n) = 4nS + T(0) = 4nS + 3S$.
Since $S$ is a constant, we have that $T \in O(\\mathit{fun}~n \rightarrow n)$.
Thus, `countdown` runs in linear time.
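As a rough empirical sanity check (not a substitute for the analysis above), we could time `countdown` on doubling inputs and confirm that the measured time roughly doubles as well. The `time` helper below is our own; `Sys.time` is the standard library's processor clock:
```
(* Measure the processor time consumed by [f x]. *)
let time f x =
  let start = Sys.time () in
  ignore (f x);
  Sys.time () -. start

let () =
  [10_000_000; 20_000_000; 40_000_000]
  |> List.iter (fun n ->
       Printf.printf "countdown %d: %.3fs\n" n (time countdown n))
```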
##### Exercise: list recurrence [✭✭]
Write down a recurrence relation bounding the running time of the
following functions on an input list of length $n$:
- `let rec length lst = match lst with [] -> 0 | _::t -> 1 + length t`
- `let rec map f lst = match lst with [] -> [] | h::t -> (f h)::(map f t)` (assume
that `f` takes at most $U$ time on any input, where $U$ is a constant)
What is the running time of each using Big-Oh notation?
□
## Additional exercises
##### Exercise: transitive [✭✭✭]
Prove that $O$ is *transitive*, that is, if $f$ is $O(g)$
and $g$ is $O(h)$, then $f$ is $O(h)$.
□
##### Exercise: multiply [✭✭✭✭]
Prove that if $f$ is $O(f')$ and $g$ is $O(g')$ then $f \cdot g$ is
$O(f' \cdot g')$. Multiplication of functions is defined as follows:
$(f \cdot g)(x) = f(x) \cdot g(x)$.
□
##### Exercise: int recurrence [✭✭✭]
Write a recurrence relation bounding the running time of the
following functions on input $n$:
- `let rec fact n = if n <= 1 then 1 else n * fact (n-1)`
- `let rec bits n = if n = 0 then 0 else 1 + bits (n/2)`
The running time of the former in Big-Oh notation is
easy to determine; the latter might be a little more difficult.
*Hint:* try expanding the formula for some sample inputs.
□
##### Exercise: fib [✭✭✭✭]
Write a recurrence relation bounding the running time of the
following function on input $n$:
```
let rec fib n =
  if n = 0 || n = 1 then 1
  else fib (n-1) + fib (n-2)
```
Can you figure out how to solve that recurrence relation and determine
the running time of `fib` in Big-Oh notation? These [old 3110 notes][oldnotes]
have an answer.
[oldnotes]: http://www.cs.cornell.edu/Courses/cs3110/2014fa/recitations/24/rec24.html
□