CS4786/CS5786 Placement Exam

(Spring 2020) Solutions are due January 31st. Submit via the Google Form linked here.

Important Note: This is a placement exam that also serves as a self-diagnostic take-home assignment. You must pass this test to enroll in the class. The points you score on this placement exam are not counted towards your grade in the course. You will need to submit your solutions by January 31st via the Google Form link provided above. The feedback we provide (along with answer keys) will guide you in terms of which areas you might want to brush up on to better prepare for the course. You are to do this problem set individually and make an honest attempt. Even if you are not able to complete a problem, write down your thoughts and whatever attempt you made. This will help us give you more detailed feedback.

I. Probability and Statistics

The following questions test your familiarity with basic probability and statistics.
1. (5 pts) Ms. Y lives next to a house with a big lawn. On any given day, the probability that it rains in Ms. Y's neighborhood is $0.1$. The probability that the sprinkler is turned on next door is $0.3$. When it rains, the probability that Ms. Y wears a poncho is $0.6$. When the sprinkler is on, the probability that Ms. Y wears a poncho is $0.2$. She never wears a poncho unless it either rained or the sprinkler was on. You meet Ms. Y wearing a poncho. Is it more likely that the sprinkler was turned on that day or that it rained in Ms. Y's neighborhood that day? Show your working.
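The problem does not specify how rain and the sprinkler interact, so any numerical sanity check needs extra assumptions. The sketch below assumes the two events are independent and combines their effects with a noisy-OR model (both assumptions are ours, not the problem's), then enumerates the four (rain, sprinkler) configurations exactly; use it only to cross-check whatever model you write down.

```python
# Hedged sanity check: assumes rain/sprinkler independence and a noisy-OR
# poncho model -- neither assumption is stated in the problem itself.
P_RAIN, P_SPRINKLER = 0.1, 0.3
P_PONCHO_RAIN, P_PONCHO_SPRINKLER = 0.6, 0.2

p_poncho = p_rain_and_poncho = p_sprinkler_and_poncho = 0.0
for rain in (0, 1):
    for sprinkler in (0, 1):
        p_config = ((P_RAIN if rain else 1 - P_RAIN)
                    * (P_SPRINKLER if sprinkler else 1 - P_SPRINKLER))
        # Noisy-OR: each active cause independently fails to trigger a poncho;
        # with neither cause active, she never wears one.
        p_no_poncho = (1 - P_PONCHO_RAIN) ** rain * (1 - P_PONCHO_SPRINKLER) ** sprinkler
        p_joint = p_config * (1 - p_no_poncho)
        p_poncho += p_joint
        p_rain_and_poncho += p_joint * rain
        p_sprinkler_and_poncho += p_joint * sprinkler

p_rain_given_poncho = p_rain_and_poncho / p_poncho
p_sprinkler_given_poncho = p_sprinkler_and_poncho / p_poncho
print(p_rain_given_poncho, p_sprinkler_given_poncho)
```

Comparing the two posteriors printed at the end answers the question under these particular modeling assumptions; your written solution should state and justify whichever assumptions you use.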

2. (10 pts) For a continuous random variable $X$ with density function $p(x)$, the differential entropy (a generalization of entropy in the discrete case) is defined as: $$H(p) = - \int p(x) \ln(p(x))\, dx$$ Consider the Gaussian random variable $X$ with mean $\mu$ and variance $\sigma^2$ (cf. here). Write down the differential entropy of this Gaussian random variable in terms of its parameters. Show your complete derivation.
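Whatever closed form you derive, you can cross-check it numerically: the snippet below approximates $-\int p(x) \ln p(x)\, dx$ for a Gaussian with a midpoint Riemann sum, so you can compare the output against your formula for a few $(\mu, \sigma)$ pairs. This is only a checking tool, not the requested derivation.

```python
import math

def gaussian_entropy_numeric(mu, sigma, num=200_000, width=12.0):
    """Midpoint-rule approximation of -integral p(x) ln p(x) dx for N(mu, sigma^2)."""
    lo, hi = mu - width * sigma, mu + width * sigma  # tails beyond 12 sigma are negligible
    dx = (hi - lo) / num
    h = 0.0
    for i in range(num):
        x = lo + (i + 0.5) * dx
        p = math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
        if p > 0.0:
            h -= p * math.log(p) * dx
    return h

print(gaussian_entropy_numeric(1.0, 2.0))
```

Note that the numeric value should depend on $\sigma$ but not on $\mu$, which is itself a useful hint about the structure of the answer.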

3. (5 pts) In a group of $100$ sports fans, $90$ are football fans and $10$ are baseball fans. You have with you a box containing $90$ football jerseys and $10$ baseball jerseys. You distribute one jersey to each fan from this box in a random order. What is the expected number of baseball fans who get baseball jerseys? Show your work. (Hint: you can do an exact counting argument but this is cumbersome, try using linearity of expectation instead.)
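As a hedged illustration of the hint (a simulation, not the exact counting argument), the sketch below hands out the 100 jerseys in random order many times and averages the number of baseball fans who receive baseball jerseys; you can compare the estimate against the value you compute via linearity of expectation.

```python
import random

random.seed(0)  # seeded so runs are reproducible

N_BASEBALL_FANS = 10
# Jerseys: 1 = baseball jersey, 0 = football jersey; fans 0..9 are the baseball fans.
jerseys = [1] * 10 + [0] * 90

trials, total = 20_000, 0
for _ in range(trials):
    random.shuffle(jerseys)  # one uniformly random assignment of jerseys to fans
    total += sum(jerseys[fan] for fan in range(N_BASEBALL_FANS))

print(total / trials)  # Monte Carlo estimate of the expectation
```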

II. Linear Algebra

The following two questions test your familiarity with linear algebra.
1. (5 pts) Let $\mathbf{x}$ and $\mathbf{y}$ be two non-zero $d$-dimensional vectors that are orthogonal (i.e., $\mathbf{x}^\top \mathbf{y} = 0$). Does there exist a $d \times d$ matrix $A$ such that the vectors $A\mathbf{x}$ and $A\mathbf{y}$ are not orthogonal? Prove your answer, showing all work.
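Before proving anything, a tiny numerical experiment can tell you which way the answer goes. The example below picks one concrete orthogonal pair in $d = 2$ and one concrete matrix $A$ (both choices ours) and checks the inner product of $A\mathbf{x}$ and $A\mathbf{y}$.

```python
# One concrete d = 2 experiment; x and y are orthogonal by construction.
x = [1.0, 0.0]
y = [0.0, 1.0]
A = [[1.0, 1.0],
     [0.0, 1.0]]  # a shear matrix (our arbitrary choice)

def matvec(M, v):
    """Plain matrix-vector product."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

Ax, Ay = matvec(A, x), matvec(A, y)
dot = sum(a * b for a, b in zip(Ax, Ay))
print(Ax, Ay, dot)  # a nonzero dot product means Ax and Ay are not orthogonal
```

An experiment like this only settles the existence question for one example; the proof you write should argue from the definitions.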

2. (25 pts) Let $M$ be a $d \times d$ square matrix that can be written as a sum of $m$ outer products, that is, $M = \sum_{i=1}^m \mathbf{m}_i \mathbf{m}_i^\top$, where each $\mathbf{m}_i$ is a $d$-dimensional vector. Show that the matrix $M$ is positive semidefinite. Show your working.
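To build intuition before writing the proof (this check is not itself a proof), the sketch below assembles a random $M = \sum_{i=1}^m \mathbf{m}_i \mathbf{m}_i^\top$ and evaluates the quadratic form $\mathbf{v}^\top M \mathbf{v}$ at many random vectors $\mathbf{v}$; the dimensions, counts, and seed are all arbitrary choices of ours.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 5, 3  # arbitrary sizes for the experiment

vecs = rng.standard_normal((m, d))
# M = sum_i m_i m_i^T, assembled explicitly from outer products.
M = sum(np.outer(v, v) for v in vecs)

# Evaluate the quadratic form v^T M v at many random test vectors.
quad_forms = [v @ M @ v for v in rng.standard_normal((1000, d))]
print(min(quad_forms))  # should be non-negative (up to float round-off)
```

Noticing *why* every evaluated quadratic form is non-negative (hint: substitute the outer-product sum into $\mathbf{v}^\top M \mathbf{v}$) points directly at the proof.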

III. Optimization

The following two questions test your familiarity with the method of Lagrange multipliers for performing constrained optimization.
1. (25 pts) Let $s_1,\ldots,s_d$ be $d$ positive numbers. Consider the function $f: \mathbb{R}^d \mapsto \mathbb{R}$ given by: \begin{align} \notag f(\mathbf{x}) &= \sum_{i=1}^d s_i \log\left(\mathbf{x}[i]\right) \end{align} What is the maximum of the above function when the vector $\mathbf{x}$ is constrained to lie on the $d$-dimensional simplex, that is, when $\mathbf{x}$ is a valid probability distribution over $d$ items: $\{\mathbf{x}: \forall i \in [d], \mathbf{x}[i] \ge 0\ \&\ \sum_{i=1}^d \mathbf{x}[i] = 1\}$? Please show the steps of your solution.
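Once you have a candidate maximizer from the Lagrangian conditions, you can stress-test it numerically. The sketch below runs a seeded random search over the simplex for one arbitrary choice of $s$ (sampling uniformly by normalizing independent exponentials) and reports the best point and value found, which you can compare against $f$ evaluated at your candidate.

```python
import math
import random

random.seed(0)
s = [1.0, 2.0, 3.0]  # arbitrary positive weights for the experiment
d = len(s)

def f(x):
    return sum(si * math.log(xi) for si, xi in zip(s, x))

best_x, best_f = None, -math.inf
for _ in range(200_000):
    e = [random.expovariate(1.0) for _ in range(d)]
    t = sum(e)
    x = [ei / t for ei in e]  # normalized exponentials: uniform on the simplex
    v = f(x)
    if v > best_f:
        best_f, best_x = v, x

print(best_x, best_f)
```

If your Lagrangian candidate is correct, no simplex point found by the search should beat it, and the best search point should sit close to your candidate.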

2. (25 pts) The Euclidean norm of a $d$-dimensional vector $\mathbf{x}$ is defined as $\|\mathbf{x}\|_2 = \sqrt{\sum_{i=1}^d \mathbf{x}[i]^2}$. Given a $d \times d$ square matrix $M$, we are interested in the following optimization problem: \begin{align*} \mathrm{Maximize~~} & \mathbf{x}^\top M \mathbf{x}\\ \mathrm{subject~to~~}& \|\mathbf{x}\|_2^2 = 1 \end{align*} (Hint: this question will use a bit of linear algebra.)

1. What is the maximizer of the above optimization problem (in terms of the matrix $M$)? Show your steps.

2. What is the maximum value of the above optimization problem?
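As a numerical companion to the two parts above (a check, not the requested derivation), the sketch below samples many random unit vectors for one arbitrary random $M$ and reports the largest value of $\mathbf{x}^\top M \mathbf{x}$ it finds; the sizes and seed are our choices. You can compare the printed value against what your derivation predicts for this particular $M$.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
M = rng.standard_normal((d, d))  # arbitrary square matrix for the experiment

# Sample many random unit vectors and evaluate the quadratic form at each.
X = rng.standard_normal((100_000, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)   # project each row onto the unit sphere
vals = np.einsum('ij,jk,ik->i', X, M, X)        # x^T M x for every row x

print(vals.max())  # compare with the value your derivation predicts for this M
```

A useful observation while deriving: since $\mathbf{x}^\top M \mathbf{x}$ is a scalar, it equals $\mathbf{x}^\top \tfrac{M + M^\top}{2} \mathbf{x}$, so only the symmetric part of $M$ matters.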