Homework 1 Solutions

- a: 26 b: 600,000 c: 133,378,058

- Input is of length 3.5E+9. At the implied rate of 1E+9 operations per second, an O(n^{2}) algorithm takes 1.225E+10 seconds, or over 388 years. Most people don't want to wait that long for a solution. An O(n lg n) algorithm takes about 111 seconds, which is definitely feasible.
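A quick arithmetic check of the figures above (a sketch; the class and method names are ours, and the 1E+9 operations-per-second machine speed is the assumption the quoted numbers imply):

```java
// Reproduce the running-time arithmetic: n^2 vs n lg n operations,
// assuming a machine executing 1e9 primitive operations per second.
public class RunTimes {
    static final double OPS_PER_SEC = 1e9;  // assumed machine speed

    static double quadraticSeconds(double n) {
        return n * n / OPS_PER_SEC;         // O(n^2) work in seconds
    }

    static double nLgNSeconds(double n) {
        double lgN = Math.log(n) / Math.log(2);  // lg n via change of base
        return n * lgN / OPS_PER_SEC;            // O(n lg n) work in seconds
    }

    public static void main(String[] args) {
        double n = 3.5e9;
        System.out.println(quadraticSeconds(n) + " s");               // 1.225e10 s
        System.out.println(quadraticSeconds(n) / 3.156e7 + " years"); // ~388 years
        System.out.println(nLgNSeconds(n) + " s");                    // ~111 s
    }
}
```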

- a. Show (n choose k) = Theta(n^{k}).

(n choose k) = [n * (n-1) * ... * (n-k+1)]/[k * (k-1) * ... * 1]

= [n/k] * [(n-1)/(k-1)] * ... * [(n-k+1)/1] < n^{k} by inspection, since each of the k factors is less than n. Thus (n choose k) = O(n^{k}).

To show (n choose k) is Omega(n^{k}), we need to show (n choose k) >= (n/k)^{k}. From the expansion above, it suffices to show that (n-i)/(k-i) >= n/k for 0 <= i <= k-1. Cross-multiplying, we need to show k(n-i) >= n(k-i), i.e. nk - ik >= nk - ni, or ni - ik >= 0. But this is clearly the case, since i(n-k) >= 0 when k <= n and i >= 0. Thus (n choose k) >= (n/k)^{k}. Since k is a constant, (n/k)^{k} is a constant multiple of n^{k}, so (n choose k) = Omega(n^{k}). Since (n choose k) is now both Omega(n^{k}) and O(n^{k}), we have (n choose k) = Theta(n^{k}).

b. Show that lg n! = Theta(n lg n).

From Stirling's formula, we have n! = sqrt(2 Pi n)*(n/e)^{n}*(1+Theta(1/n)). Taking the log of both sides, we have lg n! = [1/2]*lg(2 Pi n) + n lg n - n lg e + lg(1+Theta(1/n)). We note that [1/2]*lg(2 Pi n) is o(n lg n), n lg e is o(n lg n), and for sufficiently large n, lg(1+Theta(1/n)) < lg(1+1) = O(1), which is o(n lg n). Thus the dominant term is n lg n, and thus the whole expression is Theta(n lg n).

c. Show log_{a}n = Theta(log_{b}n) for any positive constants a and b.

We simply note that log_{a}n = log_{b}n / log_{b}a, and since log_{b}a is a constant, the two logs differ by a constant factor, which is exactly what Theta notation is designed to capture. Thus log_{a}n = Theta(log_{b}n).
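The identity can be checked numerically (a sketch; the class name is ours):

```java
// Numeric check of the change-of-base identity log_a n = log_b n / log_b a:
// the ratio log_2 n / log_10 n equals the constant log_2 10 ~ 3.3219
// for every n > 1, independent of n.
public class LogRatio {
    static double ratio(double n) {
        double log2n = Math.log(n) / Math.log(2);  // log_2 n
        return log2n / Math.log10(n);              // divide by log_10 n
    }

    public static void main(String[] args) {
        for (double n : new double[]{10, 1e3, 1e6, 1e9})
            System.out.println(ratio(n));  // always ~3.3219 = log_2 10
    }
}
```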

- lg(lg n), lg n, (lg n)^{2}, sqrt(n), n, n^{2}/lg n, n^{2}, 100^{sqrt n}, 2^{n}

- No. While substituting binary search for the linear search will find the correct
location in O(log N) time, it still takes linear time to shift all the following items
over to make space at that location. Thus, the algorithm would still be O(N^{2}).
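The point can be seen directly in a sketch of the modified insertion sort (the class and method names are ours): the search shrinks to O(log n), but the shift that makes room remains O(n) per insertion.

```java
import java.util.Arrays;

// Insertion sort with binary search for the insertion point.
// The search is O(log i) per element, but the arraycopy that shifts
// elements right is still O(i), so the sort remains O(n^2) overall.
public class BinaryInsertionSort {
    public static void sort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            // O(log i): locate the insertion point within sorted prefix a[0..i-1]
            int pos = Arrays.binarySearch(a, 0, i, key);
            if (pos < 0) pos = -(pos + 1);  // binarySearch encodes a miss as -(insertionPoint)-1
            // O(i): shift a[pos..i-1] right by one -- the step that stays linear
            System.arraycopy(a, pos, a, pos + 1, i - pos);
            a[pos] = key;
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 9, 1, 5};
        sort(a);
        System.out.println(Arrays.toString(a));  // [1, 2, 5, 5, 9]
    }
}
```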

- Phase 1 : Sort the numbers in ascending order using a sorting algorithm with time
complexity Theta(n log n), such as mergesort.

Phase 2 : Suppose the sorted numbers are stored in an array A[1..n]. For each A[i], search for the number X - A[i] in A[i+1..n] using the binary search algorithm (excluding A[i] itself, so the same element is not used twice). If found, return Yes; otherwise return No.

The first phase takes Theta(n log n) time. For the second phase, binary search is done n times, and each search takes O(log n). Thus the time complexity of the second phase is O(n log n). Combining the two phases, the algorithm has Theta(n log n) time complexity.
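The two phases can be sketched as follows (the class and method names are ours):

```java
import java.util.Arrays;

// Sort in Theta(n log n), then for each element binary-search for its
// complement X - s[i] among the later elements: n searches at O(log n) each.
public class TwoSum {
    public static boolean hasPairSummingTo(int[] a, int x) {
        int[] s = a.clone();
        Arrays.sort(s);                        // Phase 1: Theta(n log n)
        for (int i = 0; i < s.length; i++) {   // Phase 2: n binary searches
            // search s[i+1..n-1] so the same element is not used twice
            int j = Arrays.binarySearch(s, i + 1, s.length, x - s[i]);
            if (j >= 0) return true;           // found s[i] + s[j] == x
        }
        return false;
    }

    public static void main(String[] args) {
        int[] a = {8, 3, 5, 1};
        System.out.println(hasPairSummingTo(a, 9));   // true: 8 + 1
        System.out.println(hasPairSummingTo(a, 15));  // false
    }
}
```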

- At n=15, 100n^{2} is faster than 2^{n}.
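The crossover can be confirmed with a short search (a sketch; the class name is ours): at n=14 we still have 19,600 > 16,384, but at n=15 we get 22,500 < 32,768.

```java
// Find the smallest n at which 100 n^2 < 2^n, using longs to keep 2^n exact.
public class Crossover {
    public static int crossover() {
        for (int n = 1; n < 64; n++)
            if (100L * n * n < (1L << n))  // 1L << n computes 2^n
                return n;
        return -1;  // not found below 64 (never happens here)
    }

    public static void main(String[] args) {
        System.out.println(crossover());  // 15
    }
}
```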

- Is Sum (k=1 to n) 1/k^{2} bounded above by a constant?

Sum (k=1 to n) 1/k^{2} = 1 + 1/4 + 1/9 + 1/16 + ...

< 1 + integral (1 to infinity) 1/r^{2} dr

= 1 + [-1/r] (1 to infinity)

= 1 + [0 + 1]

= 2

So the sum is bounded above by the constant 2.
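A numeric sanity check of the bound (a sketch; the class name is ours): the partial sums in fact converge to pi^2/6 ~ 1.6449, comfortably below the constant 2 derived above.

```java
// Compute partial sums of 1/k^2 and observe they stay below 2.
public class BaselSum {
    static double partialSum(int n) {
        double s = 0;
        for (int k = 1; k <= n; k++)
            s += 1.0 / ((double) k * k);  // cast avoids int overflow of k*k
        return s;
    }

    public static void main(String[] args) {
        System.out.println(partialSum(1_000_000));  // ~1.64499, below the bound 2
    }
}
```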

- We are given the recurrence T(n) = T(n/2) + Theta(1). The master method applies to
recurrences of the form T(n) = aT(n/b) + f(n), so this recurrence can be solved by the
master method. We note that for this case, a=1, b=2, and f(n)=Theta(1). For these values
of a and b, we note that log_{b}a = log_{2}1 = 0, so n^{log_{b}a} = n^{0} = 1. We wish to show that we are in case 2 of the master theorem, i.e. that f(n) = Theta(n^{log_{b}a}).

RHS = Theta(n^{log_{b}a}) = Theta(n^{0}) = Theta(1) = LHS. Thus we are in case 2. In this case, the master theorem tells us that T(n) = Theta(n^{log_{b}a} lg n), or that T(n) = Theta(lg n).
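Intuitively, T(n) = T(n/2) + Theta(1) counts constant work per halving of n, and n can only be halved about lg n times before reaching 1. A direct count (a sketch; the class name is ours):

```java
// Count how many times n can be halved before reaching 1: this is the
// number of levels in the recurrence T(n) = T(n/2) + Theta(1), i.e. ~lg n.
public class Halvings {
    public static int count(int n) {
        int steps = 0;
        while (n > 1) {
            n /= 2;   // one level of the recurrence, Theta(1) work
            steps++;
        }
        return steps;
    }

    public static void main(String[] args) {
        System.out.println(count(1024));  // 10 = lg 1024
    }
}
```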

- Details of the data structure were covered in class. Basically, we maintain the
invariant that `check[check_index[key]] == key` when the information in the dictionary
for key is valid, and one of the indices will be out of range or have an incorrect
value otherwise.

```java
// CS410 Fall 1999, Homework 1 Question 10
class InvalidKey extends Exception {}
class DictionaryAbsent extends Exception {}

class Dictionary {
    // Strictly speaking, in Java all data is initialised.
    // But assume that the arrays check, check_index, and data are not
    // initialised by the new expressions below.
    // The indices check[0] ... check[check_size-1] of check_index and data
    // are initialised; furthermore, if check[i]=j (where 0 <= i < check_size)
    // then check_index[j]=i.
    // Therefore we can check if a given key is present by checking:
    //   0 <= check_index[key] < check_size and check[check_index[key]]==key
    private int check_size;
    private int check[];
    private int check_index[];
    private Object data[];

    public Dictionary(int max_key) {
        check_size = 0;
        check = new int[max_key];
        check_index = new int[max_key];
        data = new Object[max_key];
    }

    public boolean valid(int key) throws InvalidKey {
        // Check key is in range
        if (!(0 <= key && key < check.length))
            throw new InvalidKey();
        // Check if present
        int chk_ind = check_index[key];
        return 0 <= chk_ind && chk_ind < check_size && check[chk_ind] == key;
    }

    public void insert(int key, Object value) throws InvalidKey {
        if (!valid(key)) {
            // Not present; initialise
            int chk_ind = check_size++;
            check[chk_ind] = key;
            check_index[key] = chk_ind;
        }
        // Set data
        data[key] = value;
    }

    public void delete(int key) throws InvalidKey {
        if (valid(key)) {
            // Present; delete it
            check_size--;
            int chk_ind = check_index[key];
            if (chk_ind != check_size) {
                // Swap check entry for key with the top of stack
                check[chk_ind] = check[check_size];
                check_index[check[check_size]] = chk_ind;
            }
        }
        // Otherwise do nothing
    }

    public Object lookup(int key) throws InvalidKey, DictionaryAbsent {
        if (valid(key))
            return data[key];                  // Present; return data
        else
            throw new DictionaryAbsent();      // Absent; throw exception
    }
}
```
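For illustration, here is a compact, self-contained sketch of the same check/check_index trick with a usage demo (the class name `SparseSet` and its trimmed-down interface are ours, not part of the homework solution):

```java
// Constant-time-initialisation membership structure: a key is "present"
// exactly when check[checkIndex[key]] == key and checkIndex[key] is below
// the stack top, so the arrays never need to be initialised up front.
public class SparseSet {
    private final int[] check, checkIndex;
    private int checkSize = 0;   // number of valid entries on the check stack

    SparseSet(int maxKey) {
        check = new int[maxKey];
        checkIndex = new int[maxKey];
    }

    boolean valid(int key) {
        int ci = checkIndex[key];  // may be garbage; the test below filters that out
        return 0 <= ci && ci < checkSize && check[ci] == key;
    }

    void insert(int key) {
        if (!valid(key)) {
            check[checkSize] = key;
            checkIndex[key] = checkSize++;
        }
    }

    void delete(int key) {
        if (valid(key)) {
            int ci = checkIndex[key];
            int top = check[--checkSize];
            check[ci] = top;        // move top-of-stack entry into the hole
            checkIndex[top] = ci;
        }
    }

    public static void main(String[] args) {
        SparseSet s = new SparseSet(100);
        s.insert(42);
        System.out.println(s.valid(42));  // true
        s.delete(42);
        System.out.println(s.valid(42));  // false
    }
}
```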