Neural Net Workgroup

The Neural Net Theory Workgroup features short talks, followed by an informal, interactive discussion of current research on the theoretical aspects of neural networks.

For more information, please contact the organizer, Clément Hongler.

Implicit Regularization of Random Feature Models
Arthur Jacot & Berfin Simsek
February 26, 2020, 17:15, CM2

An empirical study of SGD, and the role of batch normalization in residual networks
Sam Smith
November 21, 2019, 17:15, MA B11

Fisher Information Matrix of Wide Neural Networks and Effect of Batch Normalization
Ryo Karakida
November 12, 2019, 17:15, CM4

Two Analyses of Gradient-based Optimization for Wide Two-Layer Neural Networks
Lénaïc Chizat
November 5, 2019, 17:15, CM4

Florent Krzakala
June 19, 2019, 16:15, CM4

Trainability and Accuracy of Artificial Neural Networks
Eric Vanden-Eijnden
May 22, 2019, 17:15, CO2

Neural Network Optimization and the Link with the Jamming Transition
Matthieu Wyart
March 23, 2019, 17:15, CM3

Neural Tangent Kernel
Clément Hongler
December 11, 2018, 17:15, CO2