This course is offered jointly by the TML and MLO groups. Previous year’s website: ML 2022.
See here for the ML4Science projects.
Contact us: Use the discussion forum. You can also email the head assistant Lara Orlandic, and CC both instructors.
Instructors: Nicolas Flammarion and Martin Jaggi
| Lectures  | Tuesday   | 16:15 – 18:00 | Rolex Learning Center |
|-----------|-----------|---------------|-----------------------|
|           | Wednesday | 10:15 – 12:00 | Rolex Learning Center |
| Exercises | Thursday  | 14:15 – 16:00 |                       |
| Credits   | 8 ECTS    |               |                       |
- Exam Date: Thursday 18.01.2024 from 15h15 to 18h15 (STCC – Garden Full)
- Links: exercises signup and the discussion forum. All other materials are on this page and on GitHub.
Projects: There will be two group projects during the course.
Project 1 counts for 10% of the grade and is due Oct 30.
Project 2 counts for 30% of the grade and is due Dec 21.
Code Repository for Labs, Projects, Lecture notes: github.com/epfml/ML_course
The exam is closed book, but you are allowed one crib sheet (A4 paper; both sides may be used). Bring a pen and a white eraser. You can find past years' exams with solutions here:
| Date  | Topics                                                          | Lecture notes      | Exercises             | Project           |
|-------|-----------------------------------------------------------------|--------------------|-----------------------|-------------------|
| 19/9  | Introduction, Linear Regression                                 | 01a, 01b           |                       |                   |
| 20/9  | Loss functions                                                  |                    | Lab 1                 |                   |
| 27/9  | Optimization                                                    |                    | Lab 2                 | Project 1 start   |
| 03/10 | Least Squares, Overfitting                                      | 03a, 03b           |                       |                   |
| 04/10 | Max Likelihood, Ridge Regression, Lasso                         | 03c, 03d           | Lab 3                 |                   |
| 10/10 | Generalization, Model Selection, and Validation                 | 04a                |                       |                   |
| 11/10 | Bias-Variance decomposition                                     | 04b                | Lab 4                 |                   |
| 18/10 | Logistic Regression                                             | 05b                | Lab 5                 |                   |
| 24/10 | Support Vector Machines                                         | 06a                |                       |                   |
| 25/10 | K-Nearest Neighbor                                              | 06b                | Lab 6                 |                   |
| 31/10 | Kernel Regression                                               | 7a                 |                       | Proj. 1 due 30.10.|
| 01/11 | Neural Networks – Basics, Representation Power                  | 7b                 | Lab 7                 |                   |
| 07/11 | Neural Networks – Backpropagation, Activation Functions         | 8a                 |                       | Project 2 start   |
| 08/11 | Neural Networks – CNNs, Regularization, Data Augmentation, Dropout | 8b              | Lab 8                 |                   |
| 14/11 | Neural Networks – Transformers                                  | 9a                 |                       |                   |
| 15/11 | Adversarial ML                                                  | 9b                 | Lab 9                 |                   |
| 21/11 | Ethics and Fairness in ML                                       | 10a, Ethics canvas |                       |                   |
| 22/11 | Unsupervised Learning, K-Means                                  | 10b, 10c           | Lab 10                |                   |
| 28/11 | Gaussian Mixture Models                                         | 11a                |                       |                   |
| 29/11 | EM algorithm                                                    | 11b                | Lab 11 & Project Q&A  |                   |
| 06/12 | Text Representation Learning                                    |                    | Lab 12                |                   |
| 13/12 | Generative models                                               |                    | Lab 13                |                   |
| 19/12 | Guest lecture by Devis Tuia                                     |                    |                       |                   |
| 20/12 | Projects pitch session (optional)                               |                    |                       | Proj. 2 due 21.12.|
Recommended textbooks:
- Gilbert Strang, Linear Algebra and Learning from Data
- Christopher Bishop, Pattern Recognition and Machine Learning
- Shai Shalev-Shwartz, Shai Ben-David, Understanding Machine Learning
- Michael Nielsen, Neural Networks and Deep Learning
Projects & ML4Science
Projects can take one of three forms: an ML4Science project in collaboration with a lab at EPFL, UniL, or another academic institution; the Reproducibility Challenge for ML papers; or one of the predefined ML challenges.
All info about the interdisciplinary ML4Science projects is available on the separate page here.