Angelia Nedić – Large-scale and Distributed Optimization

This is an advanced course on optimization, with particular focus on large-scale problems. The course briefly reviews fundamental optimization theory and algorithmic approaches; the exposure to the theory is tailored to the level necessary for understanding the crucial aspects of solving optimization problems. The course places strong emphasis on recent advances for solving large-scale optimization problems, where "large-scale" refers to a large number of decision variables, a large number of constraints, or a large number of component functions comprising the optimization objective. Application areas include image/signal processing, control, and machine learning, among others.

Selected Papers:

Lecture Material:

Marco Cuturi – Differentiating through Optimal Transport

Optimal transport (OT) theory is the branch of mathematics that studies and generalizes the idea of optimal matchings/assignments between two groups of observations. In this course I will present an introduction to that theory, motivating its use in ML with some examples and covering some of the best-known theoretical results in the field. After presenting a few closed-form solutions for the problem (which have proved very useful in applications), I will show why direct plug-in estimation of OT runs into several issues: computational complexity, sample complexity, and the lack of a meaningful notion of differentiability (e.g., how an optimal matching varies with changes in the inputs). I will then detail how regularization can help solve these issues and how to implement these approaches.
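As a concrete illustration of the regularized approach mentioned above, here is a minimal sketch of Sinkhorn iterations for entropically regularized OT between two discrete histograms. The function name, the toy data, and the choice of regularization strength `eps` are illustrative, not taken from the lecture material; the resulting plan varies smoothly with the inputs, which is the differentiability point made in the abstract.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.5, n_iters=500):
    """Entropically regularized OT between histograms a and b, cost matrix C.

    Returns the regularized transport plan P = diag(u) K diag(v), whose
    entries vary smoothly with a, b, and C (unlike the unregularized
    optimal matching, which can jump discontinuously).
    """
    K = np.exp(-C / eps)            # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)           # rescale columns toward marginal b
        u = a / (K @ v)             # rescale rows toward marginal a
    return u[:, None] * K * v[None, :]

# Toy example: two histograms on 3 points with squared-distance cost.
x = np.array([0.0, 1.0, 2.0])
C = (x[:, None] - x[None, :]) ** 2
a = np.array([0.5, 0.3, 0.2])
b = np.array([0.2, 0.3, 0.5])
P = sinkhorn(a, b, C)
# The row and column sums of P approximately recover a and b.
```

The alternating row/column rescaling is exactly the matrix-scaling view of the problem: each iteration projects the current plan onto one marginal constraint, and the fixed point is the unique plan matching both marginals among plans of the form diag(u) K diag(v).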


  • On Sunday, a welcome cocktail is organized at 18:30 in the main restaurant of Hotel Europe.
  • Two 90-minute lectures are organized every morning.
  • On Monday and Tuesday, workshops are organized to work on papers of the two lecturers. See details here.
  • On Wednesday and Thursday, the groups present the papers prepared during the workshops.
  • Each group presents for 20 minutes followed by 10 minutes of questions and discussions.
  • On Thursday evening, the workshop dinner is organized at 19:30.
  • On Friday, we adjourn the winter school at 12:00.

Groups for the student presentations

Wednesday 17:00 – 17:30

  • Rahul Gupta (EPFL)
  • Alatur Pragnya (ETH)
  • Negar Rezvany (EPFL)
  • Simone Rametti (EPFL)

Shi Pu, Wei Shi, Jinming Xu, Angelia Nedić, Push-Pull Gradient Methods for Distributed Optimization in Networks, IEEE Transactions on Automatic Control, 2021.

Wednesday 17:30 – 18:00

  • Christopher Criscitiello (EPFL)
  • Louis Bouvier (Ecole des Ponts ParisTech)
  • Marija Kukic (EPFL)
  • Francesco Gerini (EPFL)

Marco Cuturi, Arnaud Doucet, Fast Computation of Wasserstein Barycenters, International Conference on Machine Learning, 2014.

Wednesday 18:00 – 18:30

  • Plouton Grammatikos (EPFL)
  • Marloes Remijnse (Eindhoven University of Technology)
  • Tom Häring (EPFL)
  • Jasone Ramirez-Ayerbe

Angelia Nedić, Ion Necoara, Random minibatch projection algorithms for convex problems with functional constraints, IEEE Conference on Decision and Control, 2019.

Wednesday 18:30 – 19:00

  • Yang Shaohui (EPFL)
  • Tianshu Yang (EPFL)
  • Willem Lambrichts (EPFL)
  • Anna Konovalenko (Molde University College)

Meyer Scetbon, Marco Cuturi, Gabriel Peyré, Low-Rank Sinkhorn Factorization, International Conference on Machine Learning, 2021.