Completed Master Projects @ LIONS

2023/2024

Master projects fall

  1. Yihang Chen “Generalization of Deep ResNets in the Mean-Field Regime”
  2. Justin Deschenaux “Energy based models for incorrectly discretized labels”
  3. Edmund Hofflin “Generalisation Guarantees of Over-Parametrised Neural Networks”

2022/2023

Master projects spring

  1. Yixin Cheng “Multilinear Operator Networks for Image Recognition”
  2. Alec Flowers “Investigating Gram Geometry in Deep Networks with Normalization”
  3. Ioannis Mavrothalassitis “Instance Optimal Finite Sum Minimization”

Master projects fall

  1. Tushar Goel “Attention based Machine Learning for Security Alert Classification”
  2. Arnaud Guibbert “Knowledge graph abstraction layers”
  3. Berk Mandiracioglu “Unsupervised Document Clustering”
  4. Ke Wang “Generalization of transformers and CNNs outside of their (training) distribution”
  5. Shijian Xu “Weight Averaging for Out-of-Distribution Generalization and Few-Shot Domain Adaptation”

2021/2022

Master projects fall

  1. Kiarash Farivar “Adversarial examples and loss-task alignment”
  2. Florian Genilloud “AI model for simulation racing driving, learning how to steer, brake & accelerate”
  3. Alexandre Hutter “Self-learning Elevator Controller: Improving Reward Expressivity and Reinforcement Learning Models Scalability with Adaptive Reward Normalization”
  4. Jonas Morin “Robust domain adaptation”
  5. Zhenyu Zhu “Convergence and Generalization of Neural Architecture Search: A Finite-width NTK perspective”

Master projects spring

  1. Dimitrios Chalatsis “Trainability of High Degree Polynomial Networks”
  2. Alessandro Fornaroli “Variational Autoencoders for Incorrectly Discretized Labels”

2020/2021

Master projects fall

  1. Candeias Martins “Multi-agent Reinforcement Learning for Elevator Task Assignment”
  2. Patrick Ley “Few shot learning for pharmaceutical production line monitoring”
  3. Devavrat Tomar “Realistic Ultrasound Imaging Simulation using Deep Learning”
  4. Christine Whiteley “Self-learning elevator controller”
  5. Jiahua Wu “Learning representations as goals for EDA-RL”

Master projects spring

  1. Vincent Cabrini “Self-learning Elevator Controller: Optimizing Elevator Control Robustness with Graph Neural Networks”
  2. Choraria Moulik “The Inductive Bias of Polynomial Neural Networks”
  3. Mouhammad Haddad “Machine learning for true circuit EDA”
  4. Andrej Janchevski “Graph embedding methods for graph completion”
  5. Florian Ravasi “Reinforcement learning for hydropower plants optimization”
  6. Lombardía Roldán “Threat Detection with Graph-based Machine Learning”
  7. Murat Topak “Towards efficient deep policy networks for large scale circuit RL”

2019/2020

Master projects fall

  1. Hedi Driss “Geologically Supervised Log Labelling”

Master projects spring

  1. Yu-Ting Huang “Robust Adversarial Training in Reinforcement Learning and Inverse Reinforcement Learning”
  2. Nicola Ischia “Self-Learning Elevator Controller: Optimizing Elevators Control with Deep Reinforcement Learning”
  3. Zhaodong Sun “Solving Inverse Problems with Hybrid Deep Image Priors: the challenge of preventing overfitting”

2018/2019

Master projects fall

  1. Samuel Beuret “Understanding the time-data trade-offs in statistical learning”
  2. Guillaume Raille “Data Augmentation for Machine Learning in Low Resourced Environments”

Master projects spring

  1. Augustin Prado “Determining Accurate and Calibrated Uncertainties in Neural Networks”

2017/2018

Master projects fall

  1. Thomas Sanchez “Learning-Based Non-Cartesian Compressed Sensing for Dynamic MRI”

Master projects spring

  1. Mateusz Paluchowski “Unsupervised Document Topic Classification and Retrieval”

2016/2017

Master projects spring

  1. Maximilian Mordig “Truncated Variance Reduction in the Batch and Distributed Settings”
  2. Paul Rolland “High Dimensional Bayesian Optimization via Additive Kernel Learning”

2015/2016

Master projects fall

  1. Jérôme Bovay “Sound-Based Elevator Monitoring”

Master projects spring

  1. Dmytro Perekrestenko “Faster Optimization through Adaptive Importance Sampling”