Master Thesis Projects start once the complete master program has been finished and all credits have been obtained.
Projects for SSC and SIN students should last 4 months at EPFL or 6 months in industry or at another university.
Master Thesis Projects must be done individually.
Master Thesis Projects are worth 30 credits.
Students must have the approval of the professor in charge of the laboratory before registering for a given project.
List of Projects – Autumn 2023
Bio-implementation of a Continual Learning Method: Adapting GaNON to Unsupervised Learning Using CLAPP and Three-Factor Learning Rules
We recently developed in the lab a continual learning method, called GaNON, that can learn new tasks while preserving existing knowledge. We propose a project to further enhance the biological plausibility of this method by adapting GaNON to an unsupervised learning method called CLAPP. The project consists in adapting the GaNON method to local three-factor learning rules. Note that this is not a spiking model implementation.
- Investigate the principles and mechanisms of the GaNON continual learning method and CLAPP network.
- Derive local learning rules for the GaNON method in the CLAPP framework.
- Evaluate the performance of the adapted GaNON method in various learning tasks and compare it with existing continual learning methods.
- Explore potential applications and future directions for the bio-implementation of GaNON.
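To give a flavor of what "local three-factor learning rules" means in practice, here is a minimal sketch of a generic three-factor update in NumPy. The details of GaNON and CLAPP are specific to the lab, so all names, dimensions, and parameter values below are illustrative assumptions, not the project's actual method: a local Hebbian pre/post term is accumulated in an eligibility trace, and the weight change is gated by a third factor (e.g., a surprise or modulatory signal).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for one layer of a rate-based network
n_pre, n_post = 20, 10
w = rng.normal(scale=0.1, size=(n_post, n_pre))
eligibility = np.zeros_like(w)

eta = 1e-2    # learning rate (illustrative value)
tau_e = 10.0  # eligibility-trace time constant, in steps (illustrative value)

def step(pre, post, third_factor):
    """One update of a generic three-factor rule: the local pre*post
    (Hebbian) term is low-pass filtered into an eligibility trace, and
    the actual weight change is gated by a scalar third factor."""
    global w, eligibility
    hebb = np.outer(post, pre)                   # purely local pre/post term
    eligibility += (hebb - eligibility) / tau_e  # leaky accumulation
    w += eta * third_factor * eligibility        # gated, hence "three-factor"
    return w

pre = rng.random(n_pre)
post = np.tanh(w @ pre)
step(pre, post, third_factor=1.0)
```

The rule is local in the sense that each synapse only needs its own pre- and postsynaptic activity plus one broadcast modulatory signal, which is what makes such rules more biologically plausible than end-to-end backpropagation.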
- We expect the students to have a strong comprehension of deep neural networks.
- Knowledge of bio-inspired neural networks.
- Python programming (PyTorch is not required but useful).
Interested students can contact Martin Barry and Ariane Delrocq for more details, and send their CV and grades of relevant courses: [email protected] and [email protected].
List of Projects – Spring 2023
Mesoscopic modeling of hidden spiking neurons
In order to model the recorded spike trains with spiking neural networks, one has to consider the latent effect from unobserved neurons.
In our previous work, we derived a neuronally-grounded latent variable model and showed that it can effectively model a large number of hidden neurons on synthetic datasets.
In this project, we hope to extend the previous mesoscopic model to a larger family of neuron models, such as the adaptive leaky integrate-and-fire (ALIF) neuron.
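For students unfamiliar with the ALIF model: it is a leaky integrate-and-fire neuron whose firing threshold is raised by its own recent spikes, so it fires less readily after sustained activity. A minimal discrete-time sketch is below; all parameter values are illustrative defaults, not taken from the project or the cited papers.

```python
import numpy as np

# Minimal discrete-time adaptive leaky integrate-and-fire (ALIF) neuron.
dt = 1.0      # time step (ms)
tau_m = 20.0  # membrane time constant (ms)
tau_a = 200.0 # adaptation time constant (ms)
v_th0 = 1.0   # baseline firing threshold
beta = 0.5    # adaptation strength

def simulate(inputs):
    """Simulate one ALIF neuron. Each spike increments an adaptation
    variable that raises the effective threshold and then decays back
    to zero, producing spike-frequency adaptation."""
    v, a = 0.0, 0.0
    spikes = []
    for x in inputs:
        v += dt / tau_m * (-v + x)         # leaky membrane integration
        s = 1 if v > v_th0 + beta * a else 0
        if s:
            v = 0.0                        # reset after a spike
        a += dt / tau_a * (-a) + s         # adaptation jumps on spikes, decays
        spikes.append(s)
    return np.array(spikes)

spikes = simulate(np.full(200, 2.0))       # constant suprathreshold drive
```

Under constant drive, the inter-spike intervals of such a neuron grow over time as the adaptive threshold builds up, which is the extra latent dynamics the mesoscopic model would need to capture compared to a plain leaky integrate-and-fire neuron.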
The student should have a good background in programming (Python). Basics in deep learning and/or an interest in computational neuroscience would be a plus.
Interested students can contact Shuqi Wang and Valentin Schmutz for more details, and send their CV and grades of relevant courses: [email protected] and [email protected].
Feel free to contact us with any questions.
Wang, S., Schmutz, V., Bellec, G., & Gerstner, W. (2022). Mesoscopic modeling of hidden spiking neurons. arXiv preprint arXiv:2205.13493.
Schwalger, T., Deger, M., & Gerstner, W. (2017). Towards a theory of cortical columns: From spiking neurons to interacting neural populations of finite size. PLoS Computational Biology, 13(4), e1005507.