Student Projects

If you are interested in working with us, here are some additional projects that we would be happy to work on!

Theoretical study of model-agnostic algorithms for meta-learning.

Meta-learning leverages knowledge across different tasks to quickly learn new ones. This allows learning good estimators even when the data for a single task are scarce. Model-agnostic algorithms such as MAML and Reptile have recently seen many successes in meta-learning, yet the theoretical understanding of these methods remains limited. This project aims to (partially) explain their empirical success through the study of simple theoretical models.
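
As a concrete illustration of the kind of simple model one might study, here is a minimal sketch of the Reptile meta-update on a toy family of 1-D linear regression tasks. The task distribution, learning rates, and step counts below are illustrative assumptions, not part of the project:

    import numpy as np

    rng = np.random.default_rng(0)

    def sample_task():
        # A task is a 1-D linear regression y = w_star * x; the slope
        # w_star varies across tasks (illustrative task distribution).
        return rng.normal(loc=1.0, scale=0.5)

    def task_grad(w, w_star, n=20):
        # Gradient of the mean squared loss on a fresh batch from the task.
        x = rng.normal(size=n)
        y = w_star * x
        return 2.0 * np.mean((w * x - y) * x)

    w = 0.0                                   # meta-parameter
    inner_lr, meta_lr, inner_steps = 0.05, 0.1, 5

    for _ in range(2000):
        w_star = sample_task()
        w_task = w
        for _ in range(inner_steps):          # inner loop: adapt to the task
            w_task -= inner_lr * task_grad(w_task, w_star)
        w += meta_lr * (w_task - w)           # Reptile meta-update

    print(w)  # settles near the mean task slope (about 1.0)

The design choice this sketch exposes is that Reptile avoids second-order derivatives entirely: the meta-update simply moves the meta-parameters toward the task-adapted parameters, which is part of what makes its empirical success theoretically interesting.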

For more info, please contact Etienne.

Exploring the connection between sharpness and out-of-distribution performance.

It has been observed that the sharpness of a minimum often correlates well with the generalization performance of a deep learning model. This project aims to explore possible connections between sharpness and out-of-distribution performance, as well as uncertainty estimation.
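
As one hypothetical starting point, a common average-case notion of sharpness is the expected increase in loss under small random perturbations of the weights. The sketch below computes it for a toy linear model; the perturbation radius rho, the sample count, and the model itself are illustrative assumptions rather than a prescribed definition:

    import numpy as np

    rng = np.random.default_rng(0)

    def loss(w, X, y):
        # Squared loss of a linear model; a stand-in for a network's loss.
        return np.mean((X @ w - y) ** 2)

    def average_sharpness(w, X, y, rho=0.05, n_samples=200):
        # Expected loss increase over random perturbations of norm rho.
        base = loss(w, X, y)
        increases = []
        for _ in range(n_samples):
            eps = rng.normal(size=w.shape)
            eps *= rho / np.linalg.norm(eps)  # rescale onto the sphere of radius rho
            increases.append(loss(w + eps, X, y) - base)
        return float(np.mean(increases))

    # Toy data and its least-squares minimum.
    X = rng.normal(size=(200, 10))
    y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=200)
    w_min = np.linalg.lstsq(X, y, rcond=None)[0]
    print(average_sharpness(w_min, X, y))

With a measure like this in hand, one could correlate it against accuracy on shifted test distributions or against calibration metrics, which is the kind of connection the project would investigate.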

For more info, please contact Maksym.

Improving our understanding of batch normalization.

It has been observed that batch normalization (BN) applied to deep networks leads to faster optimization and better generalization. This project aims to better understand which components of BN (centering, variance normalization, the introduction of new learnable parameters) are crucial specifically for improving generalization performance. We also want to study the interaction of BN with stochasticity (as stochastic gradient descent is commonly used to optimize deep networks), with skip connections (as they are part of modern architectures like ResNets), and with weight decay (commonly used in neural network training). The project is expected to be mostly experimental.
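
To make the ablation concrete, here is a minimal sketch of a hand-rolled batch normalization in which centering and variance reduction can be switched off via flags, and the learnable parameters gamma and beta can be disabled by fixing them to ones and zeros; the flag names and toy data are illustrative assumptions:

    import numpy as np

    def batch_norm(x, gamma, beta, center=True, scale=True, eps=1e-5):
        # x has shape (batch, features). The flags allow ablating BN's
        # components: mean subtraction (centering) and division by the
        # standard deviation (variance reduction); gamma and beta are
        # the learnable affine parameters.
        if center:
            x = x - x.mean(axis=0)
        if scale:
            x = x / np.sqrt(x.var(axis=0) + eps)
        return gamma * x + beta

    rng = np.random.default_rng(0)
    x = rng.normal(loc=3.0, scale=2.0, size=(32, 4))
    gamma, beta = np.ones(4), np.zeros(4)

    full = batch_norm(x, gamma, beta)                      # standard BN
    no_center = batch_norm(x, gamma, beta, center=False)   # ablated variant
    print(full.mean(axis=0).round(3), full.std(axis=0).round(3))

Retraining the same network with each variant in turn would isolate which component drives the generalization gains, which is the experimental core of the project.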

For more info, please contact Maksym.