Syllabus for CS4787/5777

Principles of Large-Scale Machine Learning — Fall 2023

Term: Fall 2023
Instructor: Christopher De Sa
Course website: www.cs.cornell.edu/courses/cs4787/2023fa/
E-mail: [email hidden]
Schedule: MW 7:30-8:45PM
Office hours: Wednesdays 2PM
Room: Phillips Hall 101
Office: Gates 426

Description: CS4787 explores the principles behind scalable machine learning systems. The course covers the algorithmic and implementation principles that power the current generation of machine learning on big data. We will cover training and inference for both traditional ML algorithms, such as linear and logistic regression, and deep models. Topics include: estimating statistics of data quickly with subsampling, stochastic gradient descent and other scalable optimization methods, mini-batch training, accelerated methods, adaptive learning rates, methods for scalable deep learning, hyperparameter optimization, parallel and distributed training, and quantization and model compression.

Prerequisites: CS4780 or equivalent, CS2110 or equivalent

Format: Lectures during the scheduled lecture period will cover the course content. Problem sets will be used to encourage familiarity with the content and develop competence with the more mathematical aspects of the course. Programming assignments will help build intuition and familiarity with how machine learning algorithms run. There will be one midterm exam and one final exam, each of which will test both theoretical knowledge and programming implementation of the concepts.

Material: The course is based on books, papers, and other texts in machine learning, scalable optimization, and systems. Texts will be provided ahead of time on the website on a per-lecture basis. You are not required to read the texts, but they provide useful background for the material we discuss.

Grading: Students taking CS4787 will be evaluated on the following basis.

20% Problem sets
40% Programming assignments
15% Prelim Exam
25% Final Exam

CS5777 has an additional paper-reading component, and students taking CS5777 will be evaluated as follows.

15% Problem sets
35% Programming assignments
10% Paper reading
15% Prelim Exam
25% Final Exam

Inclusiveness: You should expect and demand to be treated by your classmates and the course staff with respect. You belong here, and we are here to help you learn—and enjoy—this course. If any incident occurs that challenges this commitment to a supportive and inclusive environment, please let the instructor know so that we can address the issue. We are personally committed to this, and subscribe to the Computer Science Department's Values of Inclusion.


The course calendar is subject to change.

Course Calendar Plan

Monday, August 21
Lecture 1. Introduction and course overview. [Notes PDF]

Problem Set 1 Released. [Notebook] [HTML]
Wednesday, August 23
Lecture 2. Linear algebra done efficiently: mapping mathematics to NumPy. ML via efficient kernels linked together in Python. [Notebook] [HTML]
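
As a small taste of this mapping (an illustrative sketch, not course code; the data is made up), compare a Python loop against a single vectorized NumPy call:

```python
import numpy as np

# Toy data: 10,000 examples with 100 features each, plus a weight vector.
X = np.random.randn(10_000, 100)
w = np.random.randn(100)

# Loop version: one dot product per example, with Python interpreter overhead each time.
preds_loop = np.array([X[i] @ w for i in range(X.shape[0])])

# Vectorized version: a single matrix-vector multiply dispatched to an optimized BLAS kernel.
preds_vec = X @ w

assert np.allclose(preds_loop, preds_vec)
```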

Background reading material:
Monday, August 28
Lecture 3. Software for learning with gradients. Numerical differentiation, symbolic differentiation, and automatic differentiation. [Notebook] [HTML]
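
A minimal sketch of one of these ideas (illustrative only; the objective f is a made-up example): use central finite differences, a numerical method, to check an analytic gradient.

```python
import numpy as np

def f(w):
    # Example objective: f(w) = ||w||^2 / 2, whose gradient is w itself.
    return 0.5 * np.sum(w ** 2)

def grad_f(w):
    return w  # analytic gradient

def numerical_grad(f, w, eps=1e-6):
    # Central finite differences: approximate each partial derivative separately.
    g = np.zeros_like(w)
    for i in range(len(w)):
        e = np.zeros_like(w)
        e[i] = eps
        g[i] = (f(w + e) - f(w - e)) / (2 * eps)
    return g

w = np.random.randn(5)
print(np.max(np.abs(grad_f(w) - numerical_grad(f, w))))  # should be tiny
```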
Wednesday, August 30
Lecture 4. Efficient gradients with backpropagation. [Notebook] [HTML]
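
For intuition, a hand-rolled sketch of backpropagation through a one-hidden-layer network (illustrative only; shapes and data are arbitrary):

```python
import numpy as np

# A minimal forward + backward pass for a one-hidden-layer network with
# squared-error loss.
rng = np.random.default_rng(0)
x = rng.standard_normal(4)        # input
y = 1.0                           # target
W1 = rng.standard_normal((8, 4))  # first-layer weights
W2 = rng.standard_normal(8)       # second-layer weights

# Forward pass, caching intermediates for the backward pass.
z = W1 @ x                        # pre-activation
h = np.maximum(z, 0.0)            # ReLU
yhat = W2 @ h                     # scalar output
loss = 0.5 * (yhat - y) ** 2

# Backward pass: apply the chain rule from the loss back to each weight.
dyhat = yhat - y                  # dL/dyhat
dW2 = dyhat * h                   # dL/dW2
dh = dyhat * W2                   # dL/dh
dz = dh * (z > 0)                 # gradient through the ReLU
dW1 = np.outer(dz, x)             # dL/dW1
```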

Background reading material:

Programming Assignment 1 Released. [Instructions] [Starter Code]
Monday, September 4
Labor Day. No Lecture.
Wednesday, September 6
Lecture 5. Machine learning frameworks. [Notebook] [HTML]
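
As one hypothetical illustration of what a framework buys you (PyTorch is shown here only as an example of such a framework), a single training step where the gradient computation is fully automatic:

```python
import torch

# One gradient step on made-up data: the framework builds the compute
# graph during the forward pass and derives all gradients automatically.
model = torch.nn.Linear(10, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.MSELoss()

X = torch.randn(32, 10)
y = torch.randn(32, 1)

opt.zero_grad()
loss = loss_fn(model(X), y)
loss.backward()   # autodiff fills in .grad for every parameter
opt.step()        # applies the SGD update
```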

Background reading material:

Problem Set 1 Due.
Monday, September 11
Lecture 6. Scaling to complex models by learning with optimization algorithms. Learning in the underparameterized regime. Gradient descent, convex optimization and conditioning. [Notebook] [HTML] [Notes PDF]
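
A minimal gradient-descent sketch on least squares (toy data; the step size alpha is an illustrative value that must be set relative to the problem's conditioning):

```python
import numpy as np

# Gradient descent on f(w) = ||Xw - y||^2 / (2n).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = rng.standard_normal(200)
n = len(y)

w = np.zeros(5)
alpha = 0.1  # step size; must be below 2/L, where L is the largest curvature
for _ in range(500):
    grad = X.T @ (X @ w - y) / n
    w -= alpha * grad
```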

Background reading material:

Problem Set 2 Released. [PDF]
Wednesday, September 13
Lecture 7. Gradient descent continued. Stochastic gradient descent. [Notebook] [HTML] [Notes PDF]
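
The corresponding stochastic sketch (again toy data): each update uses one randomly sampled example, trading per-step cost for noise in the gradient.

```python
import numpy as np

# SGD on the same least-squares objective: each step uses the gradient
# of a single randomly chosen example instead of all n of them.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = rng.standard_normal(200)

w = np.zeros(5)
alpha = 0.01
for t in range(5000):
    i = rng.integers(len(y))
    grad_i = (X[i] @ w - y[i]) * X[i]   # gradient of the i-th example's loss
    w -= alpha * grad_i
```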

Background reading material:

Programming Assignment 1 Due.
Monday, September 18
Lecture 8. Adapting algorithms to hardware. Minibatching and the effect of the learning rate. Our first hyperparameters. [Notebook] [HTML]
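
A minibatching sketch (toy data; the batch size B and step size alpha are exactly the hyperparameters in question): averaging B per-example gradients turns many small operations into one batched matrix operation that hardware handles well.

```python
import numpy as np

# Minibatch SGD: average the gradient over B examples sampled with replacement.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = rng.standard_normal(200)

w = np.zeros(5)
alpha, B = 0.05, 32
for t in range(1000):
    idx = rng.integers(0, len(y), size=B)   # sample a minibatch
    Xb, yb = X[idx], y[idx]
    grad = Xb.T @ (Xb @ w - yb) / B         # one batched matrix operation
    w -= alpha * grad
```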

Background reading material:
Wednesday, September 20
Lecture 9. Optimization techniques for efficient ML. Accelerating SGD with momentum. [Notebook] [HTML] [Notes PDF]
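
A sketch of the momentum (heavy-ball-style) update, with a placeholder gradient oracle standing in for any real objective:

```python
import numpy as np

def grad(w):
    return w  # placeholder: gradient of ||w||^2 / 2

w = np.ones(5)
v = np.zeros(5)          # velocity
alpha, beta = 0.1, 0.9   # step size and momentum coefficient
for _ in range(100):
    v = beta * v + grad(w)   # exponentially weighted sum of past gradients
    w = w - alpha * v        # step along the velocity, not the raw gradient
```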

Background reading material:

Programming Assignment 2 Released. [Instructions] [Starter Code]

Paper Reading 1 Released. [Instructions]
Monday, September 25
Lecture 10. Optimization techniques for efficient ML, continued. Accelerating SGD with preconditioning and adaptive learning rates. [Notebook] [HTML] [Notes PDF]
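
A sketch of one adaptive scheme, AdaGrad-style per-coordinate step sizes (placeholder gradient oracle; constants are illustrative):

```python
import numpy as np

def grad(w):
    return w  # placeholder gradient oracle

w = np.ones(5)
r = np.zeros(5)            # running sum of squared gradients
alpha, eps = 0.5, 1e-8
for _ in range(100):
    g = grad(w)
    r += g ** 2                          # per-coordinate curvature proxy
    w -= alpha * g / (np.sqrt(r) + eps)  # bigger steps on rarely-updated coordinates
```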

Background reading material:
Wednesday, September 27
Lecture 11. Optimization techniques for efficient ML, continued. Accelerating SGD with variance reduction and averaging. [Notebook] [HTML] [Notes PDF]
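
A compact sketch of one variance-reduction scheme, SVRG, on toy least squares (illustrative constants): a periodically recomputed full gradient corrects each stochastic gradient, shrinking its variance.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = rng.standard_normal(200)
n = len(y)

def grad_i(w, i):
    return (X[i] @ w - y[i]) * X[i]

w = np.zeros(5)
alpha = 0.05
for epoch in range(20):
    w_snap = w.copy()
    full_grad = X.T @ (X @ w_snap - y) / n   # full gradient at the snapshot
    for _ in range(n):
        i = rng.integers(n)
        # Stochastic gradient, corrected by the snapshot's gradients.
        g = grad_i(w, i) - grad_i(w_snap, i) + full_grad
        w -= alpha * g
```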

Background reading material:

Problem Set 2 Due.
Monday, October 2
Lecture 12. Sparsity and dimension reduction. [Notebook] [HTML] [Demo Notebook] [Demo HTML] [Notes PDF]
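
One dimension-reduction sketch: a Johnson-Lindenstrauss-style Gaussian random projection (toy data; k is chosen arbitrarily for illustration).

```python
import numpy as np

# Project d=1000 features down to k=50 while approximately
# preserving pairwise distances.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 1000))

k = 50
R = rng.standard_normal((1000, k)) / np.sqrt(k)  # scaled Gaussian projection
X_low = X @ R

# Distances are roughly preserved.
d_hi = np.linalg.norm(X[0] - X[1])
d_lo = np.linalg.norm(X_low[0] - X_low[1])
print(d_hi, d_lo)
```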

Background reading material:

Problem Set 3 Released. [PDF]

Paper Reading 2 Released. [Instructions]
Wednesday, October 4
Lecture 13. Deep neural networks review. The overparameterized regime and how it affects optimization. Matrix multiplication as the computational core of learning. [Demo Notebook] [Demo PDF] [Notes PDF]
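
The matrix-multiply point, in a few lines of toy code: a dense layer applied to a whole minibatch is a single matmul, and almost all DNN FLOPs look like this.

```python
import numpy as np

# (batch, d_in) @ (d_in, d_out) -> (batch, d_out): one layer, one matmul.
rng = np.random.default_rng(0)
H = rng.standard_normal((256, 512))    # activations for a batch of 256
W = rng.standard_normal((512, 1024))   # layer weights
out = np.maximum(H @ W, 0.0)           # matmul + ReLU
```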

Background reading material:

Programming Assignment 2 Due.

Paper Reading 1 Due.
Monday, October 9
Indigenous Peoples' Day. No Lecture.
Wednesday, October 11
Lecture 14. Deep neural networks review continued. Transformers and sequence models. [Notes PDF]
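
A sketch of scaled dot-product attention, the core transformer primitive (toy shapes; no masking or multiple heads):

```python
import numpy as np

def attention(Q, K, V):
    # softmax(Q K^T / sqrt(d)) V, with the softmax taken over the keys.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
Q = rng.standard_normal((6, 8))   # 6 query positions, dimension 8
K = rng.standard_normal((10, 8))  # 10 key/value positions
V = rng.standard_normal((10, 8))
out = attention(Q, K, V)          # shape (6, 8)
```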

Background reading material:
Monday, October 16
Lecture 15. Methods to accelerate DNN training. Early stopping. Batch/layer normalization. [Notes PDF]
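
A sketch of batch normalization in training mode (toy batch; gamma and beta would be learned parameters in practice):

```python
import numpy as np

def batch_norm(X, gamma, beta, eps=1e-5):
    # Standardize each feature over the batch, then scale and shift.
    mu = X.mean(axis=0)
    var = X.var(axis=0)
    X_hat = (X - mu) / np.sqrt(var + eps)
    return gamma * X_hat + beta

rng = np.random.default_rng(0)
X = rng.standard_normal((32, 16))       # batch of 32 activation vectors
out = batch_norm(X, np.ones(16), np.zeros(16))
```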

Background reading material:
Wednesday, October 18
Lecture 16. Kernels and kernel feature extraction. [Notebook] [HTML] [Notes PDF]
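
A sketch of random Fourier features approximating the RBF kernel (illustrative constants; the approximation sharpens as D grows):

```python
import numpy as np

# Approximate k(x, z) = exp(-||x - z||^2 / 2) with an explicit feature map.
rng = np.random.default_rng(0)
d, D = 5, 2000                       # input dim, number of random features
W = rng.standard_normal((D, d))      # frequencies ~ N(0, I)
b = rng.uniform(0, 2 * np.pi, D)     # random phases

def phi(x):
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x, z = rng.standard_normal(d), rng.standard_normal(d)
exact = np.exp(-np.sum((x - z) ** 2) / 2)
approx = phi(x) @ phi(z)
print(exact, approx)                 # close for large D
```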

Background reading material:

Problem Set 3 Due.

Paper Reading 2 Due.
Monday, October 23
Lecture 17. Kernels continued, and Hyperparameter Optimization Recap. [Notebook] [HTML] [Notes PDF]

Background reading material:

Programming Assignment 3 Released. [Instructions] [Starter Code]
Wednesday, October 25
Lecture 18. Gaussian Processes and Bayesian Optimization. [Notes PDF]
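
A sketch of the Gaussian process regression posterior that Bayesian optimization builds on (RBF kernel, toy 1-D data, illustrative noise level):

```python
import numpy as np

def rbf(A, B, ell=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ell ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (8, 1))            # observed settings
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(8)
Xs = np.linspace(-3, 3, 100)[:, None]     # query points

K = rbf(X, X) + 1e-2 * np.eye(8)          # kernel matrix + observation noise
Ks = rbf(Xs, X)
mean = Ks @ np.linalg.solve(K, y)                  # posterior mean
cov = rbf(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)  # posterior covariance
std = np.sqrt(np.clip(np.diag(cov), 0, None))      # predictive uncertainty
```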

Background reading material:
Monday, October 30
Lecture 19. Bayesian optimization continued, and prelim review.
Tuesday, October 31
Prelim Exam. 7:30PM, OLH 155.
Wednesday, November 1
Lecture 20. Bayesian optimization continued; possibly start parallelism.
Monday, November 6
Remote over Zoom: Lecture 21. Parallelism. [Notebook] [HTML] [Demo Notebook] [Demo HTML] [Notes PDF]
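
A sketch of the data-parallel pattern (Python multiprocessing for illustration only; real training systems parallelize with threads, GPUs, or multiple machines):

```python
import numpy as np
from multiprocessing import Pool

def partial_grad(args):
    # Least-squares gradient on one shard of the data.
    Xs, ys, w = args
    return Xs.T @ (Xs @ w - ys) / len(ys)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((1000, 5))
    y = rng.standard_normal(1000)
    w = np.zeros(5)

    # Split the data into 4 equal shards, one per worker process.
    shards = [(Xi, yi, w)
              for Xi, yi in zip(np.array_split(X, 4), np.array_split(y, 4))]
    with Pool(4) as pool:
        grads = pool.map(partial_grad, shards)
    # With equal shard sizes, the average of shard gradients is the full gradient.
    grad = np.mean(grads, axis=0)
```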

Background reading material:
  • Good resource on parallel programming, particularly on GPUs: Chapter 1 of Programming Massively Parallel Processors: A Hands-On Approach, Second Edition (by David B. Kirk and Wen-mei W. Hwu). This book is available through the Cornell library.
  • Classic work providing background on parallelism in computer architecture: Chapters 3, 4, and 5 of Computer Architecture: A Quantitative Approach. This book is available through the Cornell library.

Programming Assignment 3 Due.
Wednesday, November 8
Lecture 22. Memory locality and memory bandwidth. [Notebook] [HTML] [Notes PDF]
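
A locality sketch: summing the same number of elements through a large stride touches far more memory than a contiguous pass (timings are machine-dependent; the sizes are illustrative).

```python
import time
import numpy as np

# Same number of float64 adds, but one pass streams through contiguous
# memory while the other jumps by a large stride and misses cache.
x = np.random.randn(20_000_000)
stride = 16
contiguous = x[: len(x) // stride]   # contiguous slice
strided = x[::stride]                # same length, stride of 16 elements

t0 = time.perf_counter(); contiguous.sum(); t1 = time.perf_counter()
t2 = time.perf_counter(); strided.sum(); t3 = time.perf_counter()
print(f"contiguous: {t1 - t0:.4f}s, strided: {t3 - t2:.4f}s")
```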
Monday, November 13
Lecture 23. Floating-point arithmetic. Quantized, low-precision machine learning. [Notes PDF] [Slides PDF]
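
A sketch of uniform int8 quantization of a weight vector (a symmetric scheme with a single float scale; practical schemes vary in the details):

```python
import numpy as np

# Store 8-bit integers plus one float scale; dequantize on the fly.
rng = np.random.default_rng(0)
w = rng.standard_normal(1024).astype(np.float32)

scale = np.abs(w).max() / 127.0                       # map max magnitude to 127
q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
w_hat = q.astype(np.float32) * scale                  # dequantized values

print(np.abs(w - w_hat).max())   # worst-case rounding error is about scale/2
```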

Background reading material:
  • A classic blog post illustrating the use of low-precision arithmetic for deep learning.

Programming Assignment 4 Released. [Instructions] [Starter Code]

Paper Reading 3 Released. [Instructions]

Problem Set 4 Released. [PDF]
Wednesday, November 15
Lecture 24. Distributed learning and the parameter server. [Notes PDF] [Slides PDF]
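
A single-process toy of the parameter-server pattern (illustrative class and method names; a real server runs across the network, often asynchronously): workers pull the current weights, compute gradients on their shard, and push updates for the server to apply.

```python
import numpy as np

class ParameterServer:
    def __init__(self, dim, lr=0.05):
        self.w = np.zeros(dim)
        self.lr = lr
    def pull(self):
        return self.w.copy()      # worker fetches current weights
    def push(self, grad):
        self.w -= self.lr * grad  # server applies the update

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 5))
y = rng.standard_normal(1000)
server = ParameterServer(5)
shards = list(zip(np.array_split(X, 4), np.array_split(y, 4)))

for step in range(100):
    for Xi, yi in shards:                 # each "worker" in turn
        w = server.pull()
        grad = Xi.T @ (Xi @ w - yi) / len(yi)
        server.push(grad)
```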

Background reading material:
Monday, November 20
Lecture 25. Machine learning on GPUs. ML Accelerators. [Notes PDF] [Slides PDF]

Background reading material:
  • Parallel programming on GPUs: Chapters 2-5 of Programming Massively Parallel Processors: A Hands-On Approach, Second Edition (by David B. Kirk and Wen-mei W. Hwu). This book is available through the Cornell library.
  • The original TPU paper: In-Datacenter Performance Analysis of a Tensor Processing Unit, ISCA 2017.

Programming Assignment 5 Released. [Instructions] [Starter Code]
Wednesday, November 22
Thanksgiving Break. No Lecture.
Monday, November 27
Lecture 26. Deployment and low-latency inference. Real-time learning. Deep neural network compression and pruning. [Notes PDF] [Slides PDF]
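
A sketch of magnitude pruning (toy weight matrix; the 90% sparsity level is an arbitrary illustration): zero out the smallest-magnitude weights and keep a mask so the sparsity pattern can be reused or fine-tuned.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 256))

sparsity = 0.9
threshold = np.quantile(np.abs(W), sparsity)   # cutoff magnitude
mask = np.abs(W) > threshold                   # True = keep this weight
W_pruned = W * mask

print(1.0 - mask.mean())   # fraction of weights removed, about 0.9
```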

Background reading material:

Paper Reading 3 Due.
Wednesday, November 29
Lecture 27. Online learning. Foundation models. Transfer learning. In-context learning. Fine-tuning. [Notes PDF]

Programming Assignment 4 Due.
Monday, December 4
Lecture 28. The future of machine learning. Competitors to the transformer. [Notes PDF]

Problem Set 4 Due.

Programming Assignment 5 Due.