CS4850    Spring 2013

Mathematical Foundations for the Information Age

M W F     1:25-2:15     Hollister Hall B14

 

Announcements        Concepts        Exams        Homework       Course Staff        Office Hours       

Announcements

Concepts: The course will cover the mathematical foundations of modeling and searching the WWW and other complex networks, discovering trends, mining data, and making recommendations based on user behavior. Topics will include large graphs, random structures, phase transitions, spectral methods, data in high dimensions, Chernoff bounds, generating functions, and second moment methods.

Textbook: Mathematics for the Information Age by Ravi Kannan and John Hopcroft

Exams: 

 

Homework:  Students are encouraged to work together.  The only requirement is that you understand everything you turn in. You are expected to type your solutions (i.e., no handwriting) and submit them electronically to CMS.

Hw 1. Select one of 2.1, 2.2, 2.3, or 2.4, and "How would you estimate the mean of a random variable if its variance was not finite?"

Hw 2. 2.9, 2.10, and 2.11

Hw 3. 2.15 and 2.16

Hw 4. 2.19, and either 2.21 or 2.22

Hw 5. 2.29 and 2.47

Hw 6. 3.25 and 3.11

Hw 7. 3.12 and 3.14

Hw 8. 3.15 and 3.20

Hw 9. 3.18 and 3.19

Hw 10. 3.29, 3.31, and 3.39

Hw 11. 3.33 and 3.41

Hw 12. 3.57 and "Write a paragraph on what you have learned about random graphs."

Hw 13. 3.58 and 3.60, and make a list of five items covered in the course that you would like help with.

Hw 14. 4.7 and 4.8

Hw 15. 4.14 and 4.25

Hw 16. 4.30 and 4.31, due Monday March 4

Hw 17. 5.2 and 5.5

Hw 18. 5.13 and 5.14

Hw 19. 5.4 and 5.9

Hw 20. 5.10, using the graphs in Figure 5.12 rather than Figure 5.14, and problems 5.16 and 5.17. In Figure 5.10, delete one of the edges from u so that u has degree two.

Hw 21. 5.20 and 5.21

Hw 22. 5.25, 5.26, and 5.27

Hw 23. 5.33 and 5.48

Hw 24. 5.50 and 5.52

Hw 25. (1) List ten important items that you have learned in this course; this exercise should be a good review for the midterm. (2) Find the left and right singular vectors, the singular values, and the SVD of the 4 by 2 matrix [0 2; 2 0; 1 3; 3 1].
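
For part (2), one way to sanity-check a hand computation is a quick numerical calculation. The sketch below is only an illustration and assumes Python with numpy, which the assignment does not require.

import numpy as np

A = np.array([[0, 2],
              [2, 0],
              [1, 3],
              [3, 1]], dtype=float)

# Compact SVD: A = U * diag(s) * V^T, with U 4x2, the two singular values in s, and V^T 2x2.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
print("singular values:", s)
print("left singular vectors (columns of U):\n", U)
print("right singular vectors (rows of V^T):\n", Vt)

# Sanity check: the factors should reproduce A.
print(np.allclose(U @ np.diag(s) @ Vt, A))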

Hw 26. 6.1 and 6.4

Hw 27. Due Monday April 8. (1) 6.13. (2) List ten important concepts from Chapter 5 and write a couple of sentences about each concept.

Hw 28. Apply the perceptron learning algorithm to the following data:

a1=[1,2] a2=[2,1] a3=[1,-1] a4=[-1,-1] l1=+1 l2=-1 l3=-1 l4=+1

To simplify computation, do not normalize the ai to be unit vectors.

Plot the successive weight vectors without the threshold.
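
A minimal sketch of one standard version of these updates is given below; it assumes the rule w <- w + l_i * a_i on misclassified points, a separator through the origin (no threshold), and Python with numpy, none of which the assignment prescribes.

import numpy as np

# The four data points a1..a4 and their labels l1..l4 from the assignment.
a = np.array([[1, 2], [2, 1], [1, -1], [-1, -1]], dtype=float)
l = np.array([1, -1, -1, 1], dtype=float)

w = np.zeros(2)          # start from the zero weight vector
history = [w.copy()]     # successive weight vectors, to plot afterwards

changed = True
while changed:
    changed = False
    for ai, li in zip(a, l):
        if li * np.dot(w, ai) <= 0:   # misclassified (or on the boundary): update
            w = w + li * ai
            history.append(w.copy())
            changed = True

print(history)  # each entry is one successive weight vector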

Hw 29. 6.20 and 6.21, due April 15

Hw 30. 6.22, due April 17

Hw 31. 6.26, due April 19

Hw 32. 6.24 and 6.25, due April 22

Hw 33. Write a paragraph explaining the VC theorem and its application.

Hw 34. (1) 8.2 and (2) generate some data and apply the k-means clustering algorithm to it.
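
For part (2), the sketch below is one possible approach, assuming Python with numpy, synthetic Gaussian blobs as the data, and Lloyd's algorithm for k-means; all of these choices are illustrative rather than required.

import numpy as np

rng = np.random.default_rng(0)
# Synthetic data: three Gaussian blobs in the plane (an illustrative choice).
data = np.vstack([
    rng.normal([0, 0], 0.5, (50, 2)),
    rng.normal([5, 5], 0.5, (50, 2)),
    rng.normal([0, 5], 0.5, (50, 2)),
])

def kmeans(points, centers, iters=100):
    # Lloyd's iterations: assign each point to its nearest center, then recompute centers.
    # (This sketch does not guard against a cluster becoming empty.)
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        new_centers = np.array([points[labels == j].mean(axis=0)
                                for j in range(len(centers))])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels

init = data[rng.choice(len(data), 3, replace=False)]  # random initial centers
centers, labels = kmeans(data, init)
print(centers)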

Hw 35. Due April 29. 8.3 and 8.5

Hw 36. Due May 1. Create a set of data points that form three obvious clusters. Pick three starting points for k-means so that the algorithm does not return the obvious clusters.
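
The sketch below shows one such construction, assuming Python with numpy; the specific clusters and starting centers are illustrative assumptions, not the required answer.

import numpy as np

rng = np.random.default_rng(1)
# Three obvious clusters on a line, centered at x = 0, 10, and 20.
data = np.vstack([
    rng.normal([0, 0], 0.3, (40, 2)),
    rng.normal([10, 0], 0.3, (40, 2)),
    rng.normal([20, 0], 0.3, (40, 2)),
])

# Deliberately poor starting centers: two inside the left cluster,
# one between the middle and right clusters.
centers = np.array([[-0.5, 0.0], [0.5, 0.0], [15.0, 0.0]])

for _ in range(100):
    dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    centers = np.array([data[labels == j].mean(axis=0) for j in range(3)])

# With these starts, Lloyd's iterations settle in a local optimum that splits the
# left cluster between two centers and merges the middle and right clusters under
# the third, so the three obvious clusters are not recovered.
print(centers)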

Course Staff: Prof. John Hopcroft -- jeh@cs.cornell*
                          Ozan Irsoy, TA -- oirsoy@cs.cornell*
                          Konstantinos Mamouras, TA -- mamouras@cs.cornell*
                          Rachel Ehrlich, TA -- rae84@cornell*
                          Yipu Wang, TA -- yw294@cornell*

                            *add .edu after these addresses when sending email

Office Hours