Theory of Neural Nets Seminar

This seminar consists of talks about current research on the theory of neural networks. Each session lasts one hour and comprises a talk (about 30 minutes) followed by a discussion with questions from the audience.

Starting in May 2021, sessions will be held twice a month on Mondays at 16:30 CEST. Until the public health situation allows in-person sessions, the seminar will be held online. It is open to everyone.

For more information, please contact François Ged: francois.ged[at]epfl.ch

Past talks

July 5, 2021 at 16:30 CEST

Logistic regression explicitly maximizes margins; should we stop training early?
Matus Telgarsky, University of Illinois

June 21, 2021 at 16:30 CEST

Mode Connectivity and Convergence of Gradient Descent for (Not So) Over-parameterized Deep Neural Networks
Marco Mondelli, IST Austria

June 14, 2021

Stochastic gradient descent for noise with ML-type scaling
Stephan Wojtowytsch, Princeton University

May 31, 2021 at 16:30 CEST

On the Benefit of using Differentiable Learning over Tangent Kernels
Eran Malach, Hebrew University

May 10, 2021 at 16:30 CEST

Feature Learning in Infinite-Width Neural Networks
Greg Yang, Microsoft Research