Machine Learning CS-433 – 2017

This is the OLD 2017 course website. For the current one, see here.

This course is offered jointly with the Information Processing Group. (Course formerly known as Pattern Classification and Machine Learning).
Previous year’s website: http://www.epfl.ch/labs/mlo/page-136795.html

Our contact email: [email protected]

Instructor Martin Jaggi, Office INJ 341, Phone +41 21 69 37059, Email [email protected], Office hours by appointment
Instructor Ruediger Urbanke, Office INR 116, Phone +41 21 69 37692, Email [email protected], Office hours by appointment
Teaching Assistant Barbier Jean Email [email protected] Office INR139
Teaching Assistant Karimireddy Sai Praneeth Email [email protected] Office INJ339
Teaching Assistant Liu Wei Email [email protected] Office INR038
Teaching Assistant Lu Jun Email [email protected]
Teaching Assistant Maksai Andrii Email [email protected] Office BC307
Teaching Assistant Zhou Ruofan Email [email protected] Office BC366
Student Assistant Ajalloeian Ahmad Email [email protected]
Student Assistant Benyahia Yassine Email [email protected]
Student Assistant Borgeaud dit Avocat William Email [email protected]
Student Assistant Castellon Arevalo Joel Email [email protected]
Student Assistant Champenois Bertrand Email [email protected]
Student Assistant Charollais Clément Email [email protected]
Student Assistant Gucevska Natalija Email [email protected]
Student Assistant Kunstner Frederik Email [email protected]
Student Assistant Lahoud Fayez Email [email protected]
Student Assistant Moreau Hugo Email [email protected]
Lectures Tuesday 17:15 – 19:00  Room: CO1
Thursday 16:15 – 18:00  Room: SG1
Exercises Thursday 14:15 – 16:00  Rooms: INF119, INJ218, INM11, INM202
Language: English
Credits: 7 ECTS

For a summary of the logistics of this course, see the course info sheet here (PDF).
(The official coursebook information is also linked here.)

Special Announcements

  • The new exercise sheet, as well as the solutions (code only) for the previous week's lab session, will typically be made available each Tuesday (here and on GitHub).
  • Some old final exams: final2016, solutions
    Some old mock exams: 2016, 2015, 2014
  • Projects: There will be two group projects during the course.
    • Project 1 counts for 10% of the grade and is due Oct 30.
    • Project 2 counts for 30% of the grade and is due Dec 21.
  • Labs and projects will be in Python. See Lab 1 to get started; a minimal sketch of the style of code used in the labs follows this list.
  • Code Repository for Labs, Projects, Lecture notes: github.com/epfml/ML_course
  • Clicker: for active participation during the lectures, please point your browser to this speak-up room.
  • The FINAL EXAM is scheduled for January 17th 2018, from 16:15 to 19:15, in STCC08328.
    • The exam is closed book, but you are allowed one crib sheet (one A4 sheet, both sides may be used), either handwritten or typed in a font of at least 11 points.
    • Bring a pen; for technical reasons, pencils cannot be used for the multiple-choice part.
    • The exam will have 20+ multiple-choice questions (several answers may be correct, and wrong answers incur negative points) and 4 regular questions.
    • This year the multiple-choice part will be graded automatically, so please follow the instructions on how to mark your answers carefully.
    • Last year's exam, with solutions, is posted a few lines above.
  • Solution to final exam.
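
For orientation, here is a minimal, self-contained sketch of the kind of NumPy code the first labs work towards (linear regression with a mean-squared-error cost, fitted by gradient descent). This is a hypothetical illustration only; the function names and the synthetic data below are not taken from the course materials or lab solutions.

    # Hypothetical illustration, not from the course labs.
    import numpy as np

    def compute_mse(y, tx, w):
        # Mean squared error of the linear model y ~ tx @ w.
        e = y - tx @ w
        return 0.5 * np.mean(e ** 2)

    def gradient_descent(y, tx, w, max_iters=200, gamma=0.5):
        # Plain gradient descent on the MSE cost.
        for _ in range(max_iters):
            e = y - tx @ w
            grad = -tx.T @ e / len(y)   # gradient of the MSE cost
            w = w - gamma * grad
        return w

    # Tiny synthetic dataset: one feature plus a constant (bias) column.
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, size=50)
    y = 3.0 * x + 0.5 + 0.1 * rng.standard_normal(50)
    tx = np.c_[np.ones_like(x), x]
    w = gradient_descent(y, tx, np.zeros(2))
    print("estimated weights:", w, " MSE:", compute_mse(y, tx, w))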

Detailed Schedule

All 2017 materials are available on GitHub here (use this instead of the individual links below, or check the 2018 website).

Annotated lecture notes from each class are made available on GitHub here.

Date Topics Covered Lectures Exercises Projects
19/9 Introduction 01a,01b
21/9 Linear Regression, Cost functions 01c,01d Lab 1
26/9 Optimization 02a
28/9 Optimization Lab 2
03/10 Least Squares, ill-conditioning, Max Likelihood 03a,03b Project 1 details
05/10 Overfitting, Ridge Regression, Lasso 03c,03d Lab 3
10/10 Cross-Validation 04a
12/10 Bias-Variance decomposition 04b Lab 4
17/10 Classification 05a
19/10 Logistic Regression 05b Lab 5
24/10 Generalized Linear Models 06a
26/10 K-Nearest Neighbor 06b Q&A for proj.
31/10 Support Vector Machines 07a Proj. 1 due 30.10.
02/11 Kernel Regression 07b Lab 7
07/11 Unsupervised Learning, K-Means 08a,08b Project 2 details
09/11 K-Means, Gaussian Mixture Models 08c Lab 8
14/11 Mock Exam
16/11 Gaussian Mixture Models, EM algorithm 09a Mock exam Solutions
21/11 Matrix Factorizations 10a
23/11 Text Representation Learning 10b Lab 10
28/11 SVD and PCA 11a
30/11 SVD and PCA and Neural Networks – Basics 12a Lab 11
05/12 Neural Networks – Representation Power 12b
07/12 Neural Networks – Backpropagation, Activation Functions 12c,12d Q&A for proj.
12/12 Neural Networks – CNN, Regularization, Data Augmentation, Dropout 13a,13b
14/12 Graphical Models – Bayes Nets 13c Lab 13
19/12 Graphical Models – Factor Graphs 14a,14b
21/12 Graphical Models – Inference and Sum-Product Algorithm Proj. 2 due 21.12.

Textbooks

(not mandatory)

Christopher Bishop, Pattern Recognition and Machine Learning
Kevin Murphy, Machine Learning: A Probabilistic Perspective
Shai Shalev-Shwartz, Shai Ben-David, Understanding Machine Learning
Michael Nielsen, Neural Networks and Deep Learning