CS 410 Fall 99
Homework 1 Solutions

  1. a: 26 b: 600,000 c: 133,378,058
  2. Input is of length 3.5E+9, so an O(n^2) algorithm takes 1.225E+10 seconds, or over 388 years. Most people don't want to wait that long for a solution. An O(n lg n) algorithm takes about 111 seconds, which is definitely feasible.
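As a sanity check on these figures, the arithmetic can be reproduced in a few lines of Java. The class name and the assumed machine speed of 1e9 operations per second are illustrative, not part of the original solution; the solution's numbers are consistent with that rate.

```java
public class RuntimeEstimate {
    // Assumed machine speed: 1e9 operations per second (an assumption;
    // the solution's figures are consistent with this rate).
    static final double OPS_PER_SEC = 1e9;

    static double quadraticSeconds(double n) {
        return n * n / OPS_PER_SEC;
    }

    static double nLgNSeconds(double n) {
        return n * (Math.log(n) / Math.log(2)) / OPS_PER_SEC;
    }

    public static void main(String[] args) {
        double n = 3.5e9;
        System.out.printf("O(n^2):    %.3e seconds (%.0f years)%n",
                quadraticSeconds(n),
                quadraticSeconds(n) / (365.25 * 24 * 3600));
        System.out.printf("O(n lg n): %.0f seconds%n", nLgNSeconds(n));
    }
}
```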
  3. a. Show (n choose k) = Theta(n^k).
    (n choose k) = [n * (n-1) * ... * (n-k+1)] / [k * (k-1) * ... * 1]
    = [n/k] * [(n-1)/(k-1)] * ... * [(n-k+1)/1] < n^k by inspection, since each of the k factors is at most n. Thus (n choose k) = O(n^k).
    To show (n choose k) is Omega(n^k), we need to show (n choose k) >= (n/k)^k. From the expansion above, it suffices to show that (n-i)/(k-i) >= n/k for 0 <= i <= k-1. Cross-multiplying, we need k(n-i) >= n(k-i), i.e. kn - ki >= kn - ni, i.e. ni - ki >= 0. But this clearly holds, since k <= n and i >= 0. Thus (n choose k) >= (n/k)^k. Since k is a constant, this also means that (n choose k) = Omega(n^k). Since (n choose k) is now both Omega(n^k) and O(n^k), we have (n choose k) = Theta(n^k).
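The two bounds can also be checked numerically for small cases. The sketch below (class and method names are illustrative) computes (n choose k) exactly with the multiplicative formula and compares it against (n/k)^k and n^k for k = 3:

```java
public class BinomialBounds {
    // Exact binomial coefficient via the multiplicative formula;
    // each intermediate product is exactly divisible by (i + 1).
    static long choose(int n, int k) {
        long result = 1;
        for (int i = 0; i < k; i++) {
            result = result * (n - i) / (i + 1);
        }
        return result;
    }

    public static void main(String[] args) {
        int k = 3;
        for (int n = 10; n <= 40; n += 10) {
            double lower = Math.pow((double) n / k, k);  // (n/k)^k
            double upper = Math.pow(n, k);               // n^k
            System.out.printf("n=%d: (n/k)^k=%.0f <= C(n,k)=%d <= n^k=%.0f%n",
                    n, lower, choose(n, k), upper);
        }
    }
}
```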

    b. Show that lg n! = Theta(n lg n).
    From Stirling's formula, we have n! = sqrt(2 Pi n) * (n/e)^n * (1 + Theta(1/n)). Taking the log of both sides, we have lg n! = (1/2) lg(2 Pi n) + n lg n - n lg e + lg(1 + Theta(1/n)). We note that (1/2) lg(2 Pi n) is o(n lg n), n lg e is o(n lg n), and for sufficiently large n, lg(1 + Theta(1/n)) < lg(1 + 1) = O(1), which is o(n lg n). Thus the dominant term is n lg n, and thus the whole is Theta(n lg n).
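A quick numeric check of this result: computing lg n! directly as a sum of logs, the ratio lg(n!)/(n lg n) approaches 1 from below as n grows, as the lower-order terms predict (class name illustrative):

```java
public class StirlingCheck {
    // lg(n!) computed directly as a sum of base-2 logs.
    static double lgFactorial(int n) {
        double s = 0;
        for (int k = 2; k <= n; k++) {
            s += Math.log(k) / Math.log(2);
        }
        return s;
    }

    public static void main(String[] args) {
        for (int n : new int[]{10, 100, 10000}) {
            double ratio = lgFactorial(n) / (n * Math.log(n) / Math.log(2));
            System.out.printf("n=%d: lg(n!)/(n lg n) = %.4f%n", n, ratio);
        }
    }
}
```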

    c. Show log_a n = Theta(log_b n) for any positive constants a and b.
    We simply note that log_a n = log_b n / log_b a, and since log_b a is a constant, the two logs differ by a constant factor, which is exactly what Theta notation is designed to capture. Thus log_a n = Theta(log_b n).
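A minimal illustration of the change-of-base identity in Java (names are illustrative): the ratio of logs in two different bases is the same constant for every n.

```java
public class LogBase {
    // log_b(n) computed via natural logs: log_b n = ln n / ln b.
    static double log(double base, double n) {
        return Math.log(n) / Math.log(base);
    }

    public static void main(String[] args) {
        // The ratio log_2(n) / log_10(n) equals the constant log_2(10)
        // for every n, so the two functions differ only by that factor.
        for (double n : new double[]{10, 1e3, 1e6}) {
            System.out.printf("n=%.0e: log2(n)/log10(n) = %.6f%n",
                    n, log(2, n) / log(10, n));
        }
    }
}
```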
  4. lg(lg n), lg n, (lg n)^2, sqrt(n), n, n^2/lg n, n^2, 100^sqrt(n), 2^n
  5. No. While substituting binary search for the linear search will find the correct location in O(log N) time, it still takes linear time to shift all the following items over to make space at that location. Thus, the algorithm would still be O(N^2).
  6. Phase 1: Sort the numbers in ascending order using a sorting algorithm with time complexity Theta(n log n), such as mergesort.
    Phase 2: Suppose the sorted numbers are stored in an array A[1..n]. For each A[i], search for the number X - A[i] in A[i..n] using binary search. If found, return Yes; otherwise return No.
    The first phase takes Theta(n log n) time. In the second phase, the binary search is done n times and takes O(log n) each time, so the second phase is O(n log n). Combining the two, the algorithm has Theta(n log n) time complexity.
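The two phases can be sketched in Java as follows (names illustrative). One detail is hedged: this version searches A[i+1..n] rather than A[i..n], so an element is not paired with itself; if X may be twice a single element, start the search at i instead. Arrays.sort stands in for mergesort; substitute mergesort if a worst-case Theta(n log n) guarantee is required.

```java
import java.util.Arrays;

public class TwoSum {
    // Phase 1: sort. Phase 2: for each element, binary search the
    // suffix after it for the complementary value x - a[i].
    static boolean hasPairWithSum(int[] a, int x) {
        int[] sorted = a.clone();
        Arrays.sort(sorted);                       // Theta(n log n)
        for (int i = 0; i < sorted.length - 1; i++) {
            // Search only positions i+1..length-1 so an element is
            // not paired with itself.
            int pos = Arrays.binarySearch(sorted, i + 1, sorted.length,
                                          x - sorted[i]);
            if (pos >= 0) return true;             // found a pair
        }
        return false;
    }

    public static void main(String[] args) {
        int[] a = {8, 3, 5, 1};
        System.out.println(hasPairWithSum(a, 9));  // 8 + 1 = 9
        System.out.println(hasPairWithSum(a, 15)); // no such pair
    }
}
```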
  7. At n = 15, 100n^2 is faster than 2^n (100 * 15^2 = 22,500 < 32,768 = 2^15, while at n = 14 the inequality still goes the other way).
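This crossover can be confirmed by a brute-force loop (class name illustrative) that finds the smallest n with 100n^2 < 2^n:

```java
public class Crossover {
    // Smallest n >= 1 with 100 * n^2 < 2^n.
    static int crossover() {
        int n = 1;
        while (100L * n * n >= (1L << n)) {
            n++;
        }
        return n;
    }

    public static void main(String[] args) {
        System.out.println(crossover()); // prints 15
    }
}
```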
  8. Is Sum (k=1 to n) 1/k^2 bounded above by a constant?
    Sum (k=1 to n) 1/k^2 = 1 + 1/4 + 1/9 + 1/16 + ...
    < 1 + integral (1 to infinity) 1/r^2 dr
    = 1 + [-1/r] (evaluated from 1 to infinity)
    = 1 + [0 + 1]
    = 2
    So the sum is bounded above by the constant 2.
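The bound is easy to confirm numerically; the partial sums in fact converge to pi^2/6 ~ 1.645, comfortably below 2 (class name illustrative):

```java
public class BaselSum {
    // Partial sum of 1/k^2 for k = 1..n.
    static double partialSum(int n) {
        double s = 0;
        for (int k = 1; k <= n; k++) {
            s += 1.0 / ((double) k * k);
        }
        return s;
    }

    public static void main(String[] args) {
        for (int n : new int[]{10, 1000, 1000000}) {
            System.out.printf("n=%d: sum = %.6f%n", n, partialSum(n));
        }
        // The partial sums approach pi^2/6 ~ 1.645, below the bound of 2.
    }
}
```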
  9. We are given the recurrence T(n) = T(n/2) + Theta(1). The master method applies to recurrences of the form T(n) = aT(n/b) + f(n), so this recurrence can be solved by the master method. We note that for this case, a = 1, b = 2, and f(n) = Theta(1). For these values of a and b, log_b a = log_2 1 = 0, so n^(log_b a) = n^0 = 1. We wish to show that we are in case 2 of the master theorem, i.e. that f(n) = Theta(n^(log_b a)).
    RHS = Theta(n^(log_b a)) = Theta(n^0) = Theta(1) = LHS. Thus we are in case 2. In this case, the master theorem tells us that T(n) = Theta(n^(log_b a) lg n), i.e. that T(n) = Theta(lg n).
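Unrolling the recurrence directly gives the same answer: starting from problem size n, the constant-work step executes once per halving, i.e. about lg n times (class name illustrative):

```java
public class HalvingDepth {
    // Unrolling T(n) = T(n/2) + Theta(1): count the constant-work
    // steps until the problem size reaches 1.
    static int steps(int n) {
        int count = 0;
        while (n > 1) {
            n /= 2;
            count++;
        }
        return count;
    }

    public static void main(String[] args) {
        for (int n : new int[]{16, 1024, 1 << 20}) {
            System.out.println("n=" + n + ": " + steps(n)
                    + " steps (floor of lg n = "
                    + (31 - Integer.numberOfLeadingZeros(n)) + ")");
        }
    }
}
```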
  10. Details of the data structure were covered in class. Basically, we maintain the invariant that check[check_index[key]] == key when the information in the dictionary for key is valid, and one of the indices will be out of range or have an incorrect value otherwise.
    // CS410 Fall 1999, Homework 1 Question 10
    class InvalidKey extends Exception {}
    class DictionaryAbsent extends Exception {}
    class Dictionary {
        // Strictly speaking, in Java all data is initialised.
        // But assume that the arrays check, check_index, and data are not
        // initialised by the new expressions below.
        // The entries of check_index and data at the key values
        // check[0] ... check[check_size-1] are initialised; furthermore,
        // if check[i] = j (where 0 <= i < check_size) then check_index[j] = i.
        // Therefore we can check if a given key is present by checking:
        //    0 <= check_index[key] < check_size and check[check_index[key]]==key
        private int check_size;
        private int check[];
        private int check_index[];
        private Object data[];
        public Dictionary(int max_key) {
            check_size = 0;
            check = new int[max_key];
            check_index = new int[max_key];
            data = new Object[max_key];
        }
        public boolean valid(int key) throws InvalidKey {
            // Check key is in range
            if (!(0 <= key && key < check.length)) throw new InvalidKey();
            // Check if present
            int chk_ind = check_index[key];
            return 0 <= chk_ind && chk_ind < check_size && check[chk_ind] == key;
        }
        public void insert(int key, Object value) throws InvalidKey {
            if (!valid(key)) {
                // Not present; initialise
                int chk_ind = check_size++;
                check[chk_ind] = key;
                check_index[key] = chk_ind;
            }
            // Set data
            data[key] = value;
        }
        public void delete(int key) throws InvalidKey {
            if (valid(key)) {
                // Present; delete it
                int chk_ind = check_index[key];
                check_size--;  // pop the stack
                if (chk_ind != check_size) {
                    // Move the old top-of-stack entry into the deleted slot
                    check[chk_ind] = check[check_size];
                    check_index[check[check_size]] = chk_ind;
                }
            } // Otherwise do nothing
        }
        public Object lookup(int key) throws InvalidKey, DictionaryAbsent {
            if (valid(key))
                return data[key];              // Present; return data
            throw new DictionaryAbsent();      // Absent; throw exception
        }
    }