EECS 6002 Machine Learning Theory
MW 11:30am-1pm, BRG 211
Course description
This course takes a foundational perspective on machine learning and covers some of its underlying mathematical principles. Topics range from well-established results in learning theory to current research challenges. We start by introducing a formal framework, then present and analyze learning methods such as Nearest Neighbors, Boosting, SVMs, and Neural Networks. Finally, students present and discuss recent research papers.
Announcements
December 4
Here are some instructions for your paper report.
November 1
We have a review session for the exam on Friday, November 3, in Bergeron 217.
October 29
 As we agreed in class, the third set of exercises will be discussed on Wednesday, November 1.
 The exam will be on November 6, 11:30-12:50. Make sure to arrive on time!
 There will be a Q&A session on Friday, November 3 at 5pm. Room TBA.
 Here are some practice exam questions.
October 24
The third set of exercises for practice is out. Solutions will be discussed in class on October 30.
October 2
The second set of exercises for practice is out. Solutions will be discussed in class on October 11.
September 22
The first set of exercises for practice is out. Solutions will be discussed in class on October 2.
Lectures
 September 11, Lecture 1 Organization, Introduction
 September 13, Lecture 2 Probability recap, learning rectangle classifiers
Chapter 2
 September 18, Lecture 3 Learnability of finite classes in the realizable case
Chapter 2
 September 20, Lecture 4 Learnability of finite classes in the agnostic case; Definition of PAC Learnability; Definition of VC-dimension
Chapter 3; Section 4.2; Section 6.2
 September 25, Lecture 5 Uniform convergence, fundamental theorem of PAC learning, linear classifiers, VC-dimension of homogeneous linear classifiers
Section 4.1; Section 6.4; Chapter 9 intro; Section 9.1.3
 September 27, Lecture 6 Uniform convergence implies learnability, VC-dimension of general linear classifiers, Perceptron
Section 4.1; Section 9.1.3
 October 2, Lecture 7 Solutions to first set of exercises
 October 4, Lecture 8 Perceptron
Section 9.1.2
 October 11, Lecture 9 Solutions to second set of exercises, SVM
Section 15.1
 October 16, Lecture 10 SVM, convex functions, surrogate loss functions
Section 15.2, Section 15.2.1, Section 12.1.1, Section 12.3
 October 18, Lecture 11 Gradient descent and Stochastic gradient descent
Section 14.1, Section 14.2, Section 14.3 (without analysis), Section 14.5.1
 October 23, Lecture 12 Revision of losses, Stochastic gradient descent, SGD for SVM
Section 14.4.4, Section 14.5.3, Section 15.5
 October 25, Lecture 13 Neural networks, definition, VC-dimension
Chapter 20 up to (and including) Section 20.4
 October 30, Lecture 14 SGD for neural networks
Section 20.6
 November 1, Lecture 15 Solutions to third set of exercises
 November 6, Lecture 16 Midterm
 November 8, Lecture 17 Discussion of Solutions to Midterm
 November 13, Lecture 18 No free lunch
 November 15, Lecture 19 Student presentations
 November 20, Lecture 20 Student presentations
 November 22, Lecture 21 Student presentations
 November 27, Lecture 22 Student presentations
 November 29, Lecture 23 Student presentations
 December 4, Lecture 24 Student presentations
Exercises
Literature
Potential project papers
Fairness in Machine Learning

Cynthia Dwork, Moritz Hardt, Toniann Pitassi, Omer Reingold, Rich Zemel:
Fairness Through Awareness. arXiv 2011, ITCS 2012.

Moritz Hardt, Eric Price, Nathan Srebro:
Equality of Opportunity in Supervised Learning. arXiv 2016, NIPS 2016.

Niki Kilbertus, Mateo Rojas-Carulla, Giambattista Parascandolo, Moritz Hardt, Dominik Janzing, Bernhard Schölkopf:
Avoiding Discrimination through Causal Reasoning. arXiv 2017, to appear at NIPS 2017.
Understanding Deep Learning
Unsupervised Learning