Slides 2018


The 2018 course consists of the following topics:

lecture 1 Introduction to convex optimization and iterative methods.
lecture 2 Review of basic probability theory.
  Maximum likelihood, M-estimators, and empirical risk minimization as a motivation for convex optimization.
lecture 3 Fundamental concepts in convex analysis.
  Basics of complexity theory.
lecture 4 Unconstrained smooth minimization I:
  Concept of an iterative optimization algorithm.
  Convergence rates.
  Characterization of functions.
lecture 5 Unconstrained smooth minimization II:
  Gradient and accelerated gradient methods.
lecture 6 Unconstrained smooth minimization III:
  The quadratic case.
  The conjugate gradient method.
  Variable metric algorithms.
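A minimal sketch of the conjugate gradient method for the quadratic case covered here: minimizing f(x) = 0.5 x^T A x - b^T x (equivalently solving A x = b) with A symmetric positive definite. The function name and test problem are illustrative assumptions.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Illustrative conjugate gradient solver for A x = b, A spd."""
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x              # residual = negative gradient at x
    p = r.copy()               # first search direction
    rs = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs / (p @ Ap)          # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p      # keep directions A-conjugate
        rs = rs_new
    return x

rng = np.random.default_rng(1)
M = rng.standard_normal((30, 30))
A = M.T @ M + np.eye(30)       # symmetric positive definite test matrix
b = rng.standard_normal(30)
x = conjugate_gradient(A, b, max_iter=200)
```

In exact arithmetic the method terminates in at most n iterations; in floating point a few extra iterations drive the residual to round-off level.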
lecture 7 Stochastic gradient methods.
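A toy stochastic gradient sketch (illustrative, not from the slides): least-squares regression where each update uses the gradient of a single randomly sampled data point. The data here are noiseless, so a small constant step size already gives convergence to the exact solution; with noisy data a decreasing step-size schedule would be needed.

```python
import numpy as np

# SGD on f(w) = (1/2n) * sum_i (a_i^T w - y_i)^2, sampling one term per step.
rng = np.random.default_rng(2)
n, d = 1000, 5
A = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = A @ w_true                 # noiseless targets (interpolation regime)

w = np.zeros(d)
eta = 0.05                     # small constant step; safe here since data are noiseless
for _ in range(20000):
    i = rng.integers(n)                  # sample one data point uniformly
    g = (A[i] @ w - y[i]) * A[i]         # stochastic gradient of term i
    w -= eta * g
```

Each step costs O(d) rather than O(nd), which is the point of the method on large datasets.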
lecture 8 Non-convex optimization.
  Neural networks.
  Convergence of SGD on nonconvex problems.
lecture 9 Composite convex minimization I:
  Subgradient method.
  Proximal and accelerated proximal gradient methods.
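As an illustrative sketch of the proximal gradient method on a standard composite problem (names and problem data are assumptions, not from the slides): the lasso, min_x 0.5 ||A x - b||^2 + lam ||x||_1, where the proximal operator of the l1 norm is soft-thresholding.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t * ||.||_1: shrink each entry toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(3)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0                       # sparse ground truth
b = A @ x_true
lam = 0.1
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part

x = np.zeros(100)
for _ in range(1000):
    g = A.T @ (A @ x - b)              # gradient of the smooth term
    x = soft_threshold(x - g / L, lam / L)   # gradient step, then prox
```

Alternating a gradient step on the smooth term with the prox of the non-smooth term is exactly the composite structure this lecture exploits; the accelerated variant adds a momentum extrapolation between iterations.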
lecture 10 Composite convex minimization II:
  Proximal Newton-type methods.
  Stochastic proximal gradient methods.
lecture 11 Constrained convex minimization I:
  The primal-dual approach.
  Smoothing approaches for non-smooth convex minimization.
lecture 12 Constrained convex minimization II:
  The Frank-Wolfe method.
  The universal primal-dual gradient method.
  The alternating direction method of multipliers (ADMM).
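A minimal Frank-Wolfe sketch (illustrative assumptions throughout): projecting a point onto the probability simplex, i.e. min_x 0.5 ||x - c||^2 over the simplex. The linear minimization oracle over the simplex is trivial, since a linear function is minimized at the vertex e_i with the most negative gradient coordinate, which is what makes Frank-Wolfe attractive for such constraint sets.

```python
import numpy as np

rng = np.random.default_rng(4)
c = rng.standard_normal(10)

x = np.ones(10) / 10                  # start at the simplex barycenter
for k in range(500):
    g = x - c                         # gradient of 0.5 * ||x - c||^2
    i = np.argmin(g)                  # LMO: best vertex e_i of the simplex
    s = np.zeros(10)
    s[i] = 1.0
    gamma = 2.0 / (k + 2)             # standard Frank-Wolfe step size
    x = (1 - gamma) * x + gamma * s   # convex combination stays feasible
```

No projection is ever computed: feasibility is maintained for free because each iterate is a convex combination of simplex vertices.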