CS 6241: Numerical Methods for Data Science

Cornell University, Spring 2020.
Lectures: 2:55 PM–4:10 PM, Tuesdays and Thursdays, 203 Phillips Hall (virtual on Zoom after the move to remote instruction).

Instructor: Austin Benson, Assistant Professor, Computer Science (arb@cs.cornell.edu).
Office hours: 4:15–5:15 PM Thursdays (after class) or by appointment, 413B Gates (virtual on Zoom).

TA: Kangbo Li, PhD student, Computer Science (kl935@cornell.edu).
Office hours: 2:30–3:30 PM Mondays, 405 Rhodes (virtual on Zoom).

Class schedule

This schedule is tentative and subject to change.
Week 1
Lecture 1 (Tu 1/21): overview and background
Topics: course overview, vector and matrix norms, linear systems, eigenvalues, basic matrix factorizations, basic optimization
Readings: Lecture notes:
Lecture 2 (Th 1/23): linear least squares
Topics: solving linear least squares with matrix factorizations, statistical interpretations
Readings: Lecture notes:
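
A minimal, illustrative sketch of the Lecture 2 material: solving min_x ||Ax - b||_2 via a thin QR factorization. NumPy and the synthetic data here are assumptions for the example, not course code.

```python
import numpy as np

# Solve min_x ||A x - b||_2 via the thin QR factorization A = QR:
# the normal equations A^T A x = A^T b reduce to the triangular
# system R x = Q^T b, avoiding the squared condition number of A^T A.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))          # tall synthetic data matrix
x_true = np.arange(1.0, 6.0)
b = A @ x_true + 0.01 * rng.standard_normal(100)

Q, R = np.linalg.qr(A, mode="reduced")
x = np.linalg.solve(R, Q.T @ b)

# Agrees with the library least-squares solver.
assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])
```
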
Week 2
Lecture 3 (Tu 1/28): regularized linear least squares
Topics: Tikhonov regularization / ridge regression, Lasso, pivoted QR
Readings: Lecture notes:
Lecture 4 (Th 1/30): iterative optimization
Topics: (stochastic) gradient descent, (quasi-)Newton
Readings: Lecture notes:
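
A small sketch tying together Lectures 3 and 4: plain gradient descent on a ridge-regression (Tikhonov) objective, checked against the closed-form solution. The problem sizes, step-size rule, and iteration count are illustrative assumptions.

```python
import numpy as np

# Gradient descent on the ridge objective
#   f(x) = 0.5 ||A x - b||^2 + 0.5 * lam * ||x||^2,
# with gradient A^T (A x - b) + lam * x and step size 1/L,
# where L = ||A||_2^2 + lam bounds the Lipschitz constant of the gradient.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 10))
b = rng.standard_normal(200)
lam = 0.1

L = np.linalg.norm(A, 2) ** 2 + lam
x = np.zeros(10)
for _ in range(500):
    x -= (A.T @ (A @ x - b) + lam * x) / L

# Closed-form Tikhonov solution for comparison.
x_star = np.linalg.solve(A.T @ A + lam * np.eye(10), A.T @ b)
print(np.linalg.norm(x - x_star))          # should be near machine precision
```
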
Week 3
Lecture 5 (Tu 2/4): iterative optimization and start of latent factor models
Topics: gradient descent with errors, more SGD, momentum, acceleration, basic ideas for latent factor models
Readings: Lecture notes:
Lecture 6 (Th 2/6): latent factor models and linear dimensionality reduction
Topics: PCA, robust PCA, role of truncated SVD and proximal methods
Readings: Lecture notes:
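
A minimal PCA sketch for Lecture 6: principal directions and scores from the truncated SVD of the centered data matrix (synthetic data, illustrative only).

```python
import numpy as np

# PCA via the truncated SVD of the centered data: the top-k right singular
# vectors are the principal directions, U_k * s_k are the scores, and by
# Eckart-Young the rank-k reconstruction is optimal in Frobenius norm.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 20)) @ rng.standard_normal((20, 20))  # correlated

Xc = X - X.mean(axis=0)                    # center each feature
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 3
scores = U[:, :k] * s[:k]                  # n x k embedding
components = Vt[:k]                        # k x d principal directions
explained = s[:k] ** 2 / np.sum(s ** 2)    # fraction of variance captured
print(explained)
```
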
Week 4 [HW1 due Th 2/13 at 11:59pm ET]
Lecture 7 (Tu 2/11): latent factor models, dimensionality reduction, and matrix factorizations
Topics: SVD-based latent factors and matrix completion
Readings: Lecture notes:
Lecture 8 (Th 2/13): latent factor models, dimensionality reduction, and matrix factorizations
Topics: nonnegative matrix factorization (NMF)
Readings: Lecture notes:
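
For Lecture 8, a sketch of the classic Lee–Seung multiplicative updates for NMF; the rank, iteration count, and data are arbitrary choices for illustration.

```python
import numpy as np

# Lee-Seung multiplicative updates for min ||X - W H||_F^2 with W, H >= 0.
# Each update multiplies by a ratio of nonnegative terms, so the factors
# stay nonnegative, and the objective is non-increasing under these updates.
rng = np.random.default_rng(0)
X = rng.random((60, 40))                   # nonnegative synthetic data
r = 5                                      # target rank
W = rng.random((60, r))
H = rng.random((r, 40))

eps = 1e-12                                # guard against division by zero
for _ in range(300):
    H *= (W.T @ X) / (W.T @ W @ H + eps)
    W *= (X @ H.T) / (W @ H @ H.T + eps)

print(np.linalg.norm(X - W @ H))
```
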
Week 5
Lecture 9 (Tu 2/18): interpretable latent factor models and start of tensors
Topics: interpolative decomposition, CUR, basics on tensors, problems with border rank
Readings: Lecture notes:
Lecture 10 (Th 2/20): low-rank tensor decompositions
Topics: more difficulties with tensors, CP and Tucker decompositions
Readings: Lecture notes:
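
For Lectures 9–10, a compact alternating least squares (ALS) sketch for a rank-R CP decomposition of a 3-way tensor, using only NumPy; the helper names khatri_rao and cp_als are hypothetical, not course code.

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Kronecker product: (I x R), (J x R) -> (I*J x R)."""
    I, R = A.shape
    J, _ = B.shape
    return (A[:, None, :] * B[None, :, :]).reshape(I * J, R)

def cp_als(X, rank, iters=200, seed=0):
    """ALS for X ~ sum_r a_r (outer) b_r (outer) c_r on a 3-way array."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    X1 = X.reshape(I, J * K)                       # mode-1 unfolding
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)    # mode-2 unfolding
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)    # mode-3 unfolding
    for _ in range(iters):
        A = X1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = X2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = X3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Recover an exactly rank-3 synthetic tensor.
rng = np.random.default_rng(1)
At, Bt, Ct = (rng.standard_normal((n, 3)) for n in (8, 7, 6))
X = np.einsum('ir,jr,kr->ijk', At, Bt, Ct)
A, B, C = cp_als(X, 3)
print(np.linalg.norm(X - np.einsum('ir,jr,kr->ijk', A, B, C)))
```

ALS comes with no global guarantee (and a best rank-R approximation need not even exist, per the border-rank discussion above), but on an exactly low-rank tensor like this it typically drives the residual to near zero.
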
Week 6
No lecture Tu 2/25 (February break)

Lecture 11 (Th 2/27): Nonlinear dimensionality reduction
Topics: ISOMAP, LLE, t-SNE
Readings: Lecture notes:
Week 7 [HW2 due Th 3/5 at 11:59pm ET]
Lecture 12 (Tu 3/3): Basic network analysis
Topics: matrices associated with graphs, common network properties
Readings: Lecture notes:
Lecture 13 (Th 3/5): Basic network analysis
Topics: common network properties, random graph models
Readings: Lecture notes:
Week 8
Lecture 14 (Tu 3/10): Unsupervised learning on graphs
Topics: spectral methods for bipartitioning, ratio cut, normalized cut
Readings: Lecture notes:
Lecture 15 (Th 3/12): Unsupervised learning on graphs
Topics: conductance, Cheeger's inequality, k-way spectral clustering
Readings: Lecture notes:
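
A minimal sketch for Lectures 14–15: spectral bipartitioning by thresholding the Fiedler vector of the normalized Laplacian (a sweep cut chosen by conductance, as Cheeger's inequality suggests, would be the more careful variant). The toy graph and function name are illustrative.

```python
import numpy as np

def spectral_bipartition(A):
    """Split a connected graph by the sign of the (scaled) Fiedler vector
    of L_sym = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    ds = 1.0 / np.sqrt(d)
    L_sym = np.eye(len(d)) - ds[:, None] * A * ds[None, :]
    w, V = np.linalg.eigh(L_sym)               # ascending eigenvalues
    fiedler = ds * V[:, 1]                     # undo the D^{1/2} scaling
    return fiedler >= 0

# Two dense blocks joined by a single edge.
A = np.zeros((8, 8))
A[:4, :4] = 1.0
A[4:, 4:] = 1.0
np.fill_diagonal(A, 0.0)
A[3, 4] = A[4, 3] = 1.0
print(spectral_bipartition(A))                 # separates the two blocks
```
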
Week 9
No lecture Tu 3/17 or Th 3/19 (Cornell classes suspended due to COVID-19)

Week 10
No lecture Tu 3/24 or Th 3/26 (Cornell classes suspended due to COVID-19)

Week 11
No lecture Tu 3/31 or Th 4/2 (spring break)

Week 12 [Reaction paper due Th 4/9 at 11:59pm ET]
Lecture 16 (Tu 4/7): Graph-based semi-supervised learning [VIRTUAL SYNCHRONOUS LECTURE]
Topics: classical graph-based semi-supervised techniques with connections to random walks and spectral clustering
Readings: Lecture notes:
Lecture 17 (Th 4/9): Node representation learning in graphs [VIRTUAL SYNCHRONOUS LECTURE]
Topics: latent space models, role discovery, node embeddings
Readings: Lecture notes:
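
For Lecture 16, a sketch of harmonic-function semi-supervised learning in the Zhu–Ghahramani–Lafferty style: clamp the labeled nodes and solve a Laplacian system for the rest. The path graph and labels are toy assumptions.

```python
import numpy as np

def harmonic_ssl(A, labeled, labels):
    """Clamp f on labeled nodes; enforce (L f)_u = 0 on unlabeled nodes u,
    which gives f_u = -L_uu^{-1} L_ul f_l for the graph Laplacian L."""
    n = A.shape[0]
    L = np.diag(A.sum(axis=1)) - A
    u = np.setdiff1d(np.arange(n), labeled)
    f = np.zeros(n)
    f[labeled] = labels
    f[u] = np.linalg.solve(L[np.ix_(u, u)], -L[np.ix_(u, labeled)] @ labels)
    return f

# Path graph with the two endpoints labeled +1 and -1.
n = 7
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
print(harmonic_ssl(A, np.array([0, n - 1]), np.array([1.0, -1.0])))
```

On the path graph the harmonic solution interpolates linearly between the two labels, which matches the random-walk (absorption probability) interpretation mentioned in the topics.
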
Week 13
Lecture 18 (Tu 4/14): Graph neural networks [VIRTUAL SYNCHRONOUS LECTURE]
Topics: high-level ideas, basic derivation of graph convolutional networks
Readings: Lecture notes:
Lecture 19 (Th 4/16): Graph neural networks [VIRTUAL SYNCHRONOUS LECTURE]
Topics: more general architectures, other prediction tasks
Readings: Lecture notes:
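
For Lectures 18–19, a single graph convolutional layer in the Kipf–Welling form, written directly in NumPy; there is no training loop, and the weights are random placeholders.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)."""
    A_hat = A + np.eye(A.shape[0])             # add self-loops
    ds = 1.0 / np.sqrt(A_hat.sum(axis=1))      # degrees >= 1, safe to invert
    A_norm = ds[:, None] * A_hat * ds[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)     # ReLU

rng = np.random.default_rng(0)
A = (rng.random((10, 10)) < 0.3).astype(float)
A = np.triu(A, 1)
A = A + A.T                                    # symmetric, no self-loops
H = rng.standard_normal((10, 4))               # input node features
W = rng.standard_normal((4, 8))                # weights (random stand-in)
print(gcn_layer(A, H, W).shape)                # (10, 8)
```
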
Week 14 [Project proposal due Th 4/23 at 11:59pm ET]
"Lecture" 20 (Tu 4/21): small subgraph patterns [READ ON YOUR OWN]
Topics: network motifs and network structure
Readings: "Lecture" 21 (Th 4/23): small subgraph patterns [VIRTUAL SYNCHRONOUS DISCUSSION]
Topics: discussion on network motifs and network structure, subgraph counting algorithms
Readings: Lecture notes:
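
For "Lectures" 20–21, the simplest subgraph-counting primitive as a sketch: triangles from the trace of A^3. The four-node example graph is illustrative.

```python
import numpy as np

def count_triangles(A):
    """trace(A^3) counts each triangle 6 times
    (3 starting vertices x 2 orientations)."""
    return int(round(np.trace(A @ A @ A))) // 6

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
print(count_triangles(A))        # 2: triangles (0,1,2) and (1,2,3)
```
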
Week 15
Lecture 22 (Tu 4/28): kernels [VIRTUAL SYNCHRONOUS LECTURE]
Topics: feature maps, kernel trick, function approximation, RKHSs
Readings: Lecture notes:
Lecture 23 (Th 4/30): kernels [VIRTUAL SYNCHRONOUS LECTURE]
Topics: RKHSs, native spaces, Moore–Aronszajn, Gaussian Processes, GP likelihood and hyperparameter optimization
Readings: Lecture notes:
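
For Lectures 22–23, a compact Gaussian process regression sketch following the standard Cholesky-based recipe (as in Rasmussen and Williams, Algorithm 2.1), including the log marginal likelihood one would maximize over the lengthscale and noise hyperparameters. The kernel choice, lengthscale, and data are arbitrary.

```python
import numpy as np

def rbf(X1, X2, ell=0.5):
    """Squared-exponential kernel with lengthscale ell."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (30, 1))                # training inputs
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)
Xs = np.linspace(-3, 3, 100)[:, None]          # test inputs
sigma2 = 0.01                                  # noise variance

K = rbf(X, X) + sigma2 * np.eye(30)
Lc = np.linalg.cholesky(K)
alpha = np.linalg.solve(Lc.T, np.linalg.solve(Lc, y))

Ks = rbf(X, Xs)
mean = Ks.T @ alpha                            # posterior mean at Xs
v = np.linalg.solve(Lc, Ks)
var = rbf(Xs, Xs).diagonal() - (v * v).sum(0)  # posterior variance at Xs

# Log marginal likelihood: the objective for hyperparameter optimization.
logml = (-0.5 * y @ alpha - np.log(np.diag(Lc)).sum()
         - 0.5 * 30 * np.log(2 * np.pi))
print(logml)
```
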
Week 16 [Progress report due Th 5/7 at 11:59pm ET]
"Lecture" 24 (Tu 5/5): Gaussian Processes [READ ON YOUR OWN]
Topics: fast computation for hyperparameter optimization
Readings: "Lecture" 25 (Th 5/7): Gaussian Processes [VIRTUAL SYNCHRONOUS DISCUSSION]
Topics: fast computation for hyperparameter optimization
Readings: Lecture notes:
Week 17
Project feedback sessions (Mon 5/11 and Tu 5/12)
Week 18 [Final project report due Th 5/21 at 11:59pm ET]

Coursework

Coursework will be managed through CMS, where assignments will also be submitted. The required coursework consists of three components:
  • Homework (20%)
    There are two homeworks, each with a theoretical component and/or an implementation and data analysis component.
  • Reaction paper (20%)
    The second component of the course is composing a (roughly) 5-page reaction paper that summarizes and critiques at least two closely related published papers relevant to the class. The motivation of this assignment is to get everyone thinking about research issues related to the class material and to stimulate ideas for the course project (described below). You can work individually or in groups of two or three. Your group members can be the same as your project partners, but they do not have to be. There may be a group presentation involved. Additional details will be posted soon.
  • Course project (60%)
    Most of your grade will be determined by a course project, where the topic is of your choosing but must be related to the class. You can work individually or in groups of two or three (the scope of the project should be scaled according to the group size). We will be looking for three key pieces in the course project: (i) some theoretical or mathematical discussion of a numerical method or algorithm (broadly construed); (ii) some software implementation of a method or algorithm; and (iii) some analysis of a real-world dataset. The project will be split into three components: a proposal, a progress report, and a final report, worth 10%, 15%, and 35% of the final grade, respectively. More details will be posted soon.