Student Projects

  • We offer a wide variety of projects in the areas of Machine Learning, Optimization, and their applications. The list below is not complete but serves as an overview.
  • Students who are interested in doing a project at the MLO lab are encouraged to have a look at our page where we describe what you can expect from us (and what we expect from you).

  • If you have not (yet) taken our courses, we might ask for your grade sheet to get to know your background in the topic area.
  • We can only supervise a limited number of projects each semester. Priority will be given to master's thesis projects (full-time).

Available MSc, BSc and PhD Semester Projects

  • Landscape analysis and second-order methods in deep learning
    We are interested in studying the geometry of loss landscapes in deep neural networks and the generalization properties of their stationary points. We want to use visualization tools to plot 2D and 3D projections of the loss surface to distinguish sharp and wide minima and regions with saddle points. Then, we want to compare the trajectories of first-order optimization algorithms with those of cubically regularized Newton, which provably converges to a second-order local minimum.
    Contact: Nikita Doikov & Martin Jaggi
  • Large Batch Training
    A study of the performance of recent deep learning optimizers for large-batch training. Experiment with variations of optimizers, learning-rate schedules, and tricks such as gradient clipping to identify which aspects are most important for high-performance large-batch training. Contact: Atli Kosson
  • Several projects in collaborative, decentralized and federated learning
    Theoretical and practical projects, mostly at the master's level, in addition to the projects offered below (some are already taken). Please let us know about your motivation, interests, and background. Contact: Sebastian Stich & Hadrien Hendrikx & Martin Jaggi
  • Privacy-preserving robust training with zero-knowledge proofs
    In distributed training, we can combine basic secure aggregation schemes for summing up gradients with Euclidean distance checks, providing robustness and personalization during training.
    Contact: Lie He & Martin Jaggi
  • Build a decentralized ML framework in the browser
    (Practical project)
    Join our larger team project to build decentralized (and federated) training software, where many clients (e.g. mobile phones or hospitals) can collaboratively train a joint ML model, while respecting data privacy and locality. Combines algorithmic and practical challenges. Code mostly in JavaScript. Contact: Martin Jaggi (and you can directly join the slack channel)
  • Performance optimization of PowerSGD in PyTorch on GPUs
    Our PowerSGD algorithm is now part of PyTorch. PowerSGD compresses the communication between workers in distributed optimization to speed up training. Of course, communication compression is only useful if the compression operation takes less time than it saves in communication. Inspired by influential work by OpenAI, this project will go deep into optimizing the compression performance of PowerSGD on GPUs. If the project is successful, your work might be integrated into PyTorch.
    We are looking for someone with CUDA experience. Contact: Thijs Vogels
  • Collaborative & Federated Learning
    • Efficient Collaborative Deep Learning Training with heterogeneous data
      In current distributed SGD algorithms (centralized/decentralized), when data are heterogeneously (non-iid) distributed between workers (e.g. every worker has training examples only from one class), the generalization of the resulting models is often poor. This project aims to better understand the reasons for and possible solutions to this problem. Contact: Tao Lin, Anastasia Koloskova, Sebastian Stich, Martin Jaggi
    • Federated Learning for Once-for-all Neural Networks
      Please read the project description here, and contact Tao Lin for more details.
      • Efficient training methods for FL
      • Training methods for heterogeneous settings (data imbalance, compute imbalance, etc.)
    • Personalized Collaborative Learning: theoretical and practical projects available. For example: learning collaboration weights, to identify helpful collaborators. Contact Martin Jaggi or Sebastian Stich for further information.
  • Text Steganography using Transformers
    Implement a new steganography scheme which translates source text on the fly into equally long and natural ciphertext. Contact: Martin Jaggi & Prakhar Gupta
  • Text Summarization using Transformers
    Implement text generation and summarization methods which can vary the reading level difficulty of the produced text. Contact: Martin Jaggi
  • Model-Parallel Training in Deep Learning
    (Practical or theory project.) Efficient training methods such as PipeDream rely on ‘asynchronous back-propagation’. Can we understand the effect of this asynchronicity from a theoretical point of view? See also this paper.
    Contact: Tao Lin & Atli Kosson & Sebastian Stich
  • Low Precision Arithmetic
    (mostly theory)
    Most hardware DL accelerators rely on limited-precision arithmetic. We start by compiling a survey of existing techniques (with corresponding error bounds) and will explore the possibilities of error feedback (as in sparsified SGD). Contact: Sebastian Stich & Tao Lin
  • Alternative Arithmetic for Deep Networks
    We want to explore alternative arithmetic for deep nets, for example by getting rid of all multiplications, as they are expensive. We will continue preliminary experiments with such alternative neuron operations in PyTorch or JAX, both for training and inference, and also investigate the practical and theoretical efficiency. Contact: Martin Jaggi
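For the landscape-analysis project above, a common starting point is to pick two random directions in parameter space and evaluate the loss on a 2D grid around a trained solution. A minimal numpy sketch with a toy quadratic loss (all names here are illustrative, not from an existing codebase):

```python
import numpy as np

def loss_surface_2d(loss_fn, theta, n=21, radius=1.0, seed=0):
    """Evaluate loss_fn on a 2D slice through theta along two random directions."""
    rng = np.random.default_rng(seed)
    d1 = rng.standard_normal(theta.shape)
    d2 = rng.standard_normal(theta.shape)
    d1 /= np.linalg.norm(d1)
    d2 /= np.linalg.norm(d2)
    alphas = np.linspace(-radius, radius, n)
    surface = np.array([[loss_fn(theta + a * d1 + b * d2) for b in alphas]
                        for a in alphas])
    return alphas, surface

# Toy check: for a quadratic bowl centered at theta = 0, the grid's lowest
# value must sit at the center of the slice (alphas[10] == 0.0).
quad = lambda w: float(w @ w)
alphas, S = loss_surface_2d(quad, np.zeros(10))
assert S.min() == S[10, 10]
```

The resulting `surface` array can be fed directly to a contour or 3D surface plot to inspect sharp versus wide minima.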
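For the PowerSGD performance project, the algorithmic core is a single power-iteration step that approximates a gradient matrix M by a rank-r product P Qᵀ, so that only the two thin factors need to be communicated. A self-contained numpy sketch of that step (the production version ships as a PyTorch DDP communication hook; the function name below is illustrative):

```python
import numpy as np

def power_iter_compress(M, Q):
    """One PowerSGD-style power-iteration step: M (m x n) -> P (m x r), Q (n x r)."""
    P = M @ Q              # project M onto the current right factor
    P, _ = np.linalg.qr(P) # orthonormalize the columns of P
    Q = M.T @ P            # update the right factor
    return P, Q            # M is approximated by P @ Q.T

rng = np.random.default_rng(0)
# A gradient matrix of exact rank 2 is recovered perfectly by a rank-2 compressor.
M = rng.standard_normal((64, 2)) @ rng.standard_normal((2, 32))
Q0 = rng.standard_normal((32, 2))
P, Q = power_iter_compress(M, Q0)
assert np.allclose(P @ Q.T, M)
```

Communicating P and Q costs r·(m+n) floats instead of m·n, which is where the speedup comes from when compression itself is fast enough.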
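For the alternative-arithmetic project, one concrete multiplication-free neuron (explored e.g. in the AdderNet line of work) replaces the dot product with a negative L1 distance between inputs and weights, using only additions and absolute values. A purely illustrative numpy sketch:

```python
import numpy as np

def adder_layer(x, W):
    """Multiplication-free 'neuron': output_j = -sum_i |x_i - W[i, j]|."""
    return -np.abs(x[:, None] - W).sum(axis=0)

x = np.array([1.0, -2.0, 3.0])
W = np.array([[1.0, 0.0],
              [-2.0, 0.0],
              [3.0, 0.0]])
out = adder_layer(x, W)
# Column 0 matches x exactly (distance 0); column 1 is all zeros (distance 6).
assert np.allclose(out, [0.0, -6.0])
```

As with a dot product, the output is maximized when the input pattern matches the weight column, which is what makes the L1 form a drop-in candidate for a neuron's similarity measure.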

Contact us for more details!

Interdisciplinary Projects

See here for the full list of interdisciplinary health-related projects of our initiative iGH (intelligent Global Health).

Additional interdisciplinary projects (non-medical):

  • ML for Advanced Manufacturing

In the context of manufacturing nano-scale sensors consisting of individual carbon nanotubes, we use a machine learning approach to classify Raman spectroscopy data. Based on a large labelled dataset of 1-dimensional Raman Spectroscopy measurements, we detect for each measurement position the presence of a healthy nanotube as well as its orientation. In addition to standard ML techniques, we also employ unsupervised signal representation learning based on CNNs in a semi-supervised setting.
Contact: Martin Jaggi

  • Dream Prediction from EEG data

(Project only available full-time) The goal of this project is to apply machine learning to predict when (and what!) a sleeper is dreaming based on electroencephalogram (EEG) time series. We have access to a large dataset consisting of 500 Hz signals from 256 electrodes over the 2 minutes before awakening, labeled by whether the sleeper remembered a dream or not. We propose to use a convolutional neural network (CNN) to classify these EEG time series.
Contacts: Francesca Siclari, CHUV and Martin Jaggi, EPFL
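A first preprocessing step for recordings like those described above is to slice each (channels × samples) array into fixed-length, overlapping windows that a CNN can consume. A minimal numpy sketch (window and hop lengths are illustrative choices, not from the actual project):

```python
import numpy as np

def make_windows(eeg, win_sec=4, hop_sec=2, fs=500):
    """Slice a (channels, samples) EEG recording into overlapping windows."""
    win, hop = win_sec * fs, hop_sec * fs
    n = (eeg.shape[1] - win) // hop + 1
    return np.stack([eeg[:, i * hop : i * hop + win] for i in range(n)])

# The recordings described above: 256 electrodes at 500 Hz over 2 minutes.
eeg = np.zeros((256, 2 * 60 * 500))
windows = make_windows(eeg)
assert windows.shape == (59, 256, 2000)
```

Each window then becomes one training example (with the recording's dream/no-dream label), which multiplies the effective dataset size before any model is trained.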



Completed Thesis Projects

Master Theses (internal at EPFL):

Liamarcia Bifano, 2020, A machine learning platform to generate data-responsive and medically approved clinical decision algorithms
Pei Wang, 2020, Transformer-Based Multi-lingual Sentence Embeddings
Ahmad Ajalloeian, 2019, Stochastic Zeroth-Order Optimisation Algorithms with Variance Reduction
Akhilesh Gotmare, 2019, Layerwise Model Parallel Training of Deep Neural Networks
Andreas Hug, 2018, Unsupervised Learning of Embeddings for Detecting Lexical Entailment
Jean-Baptiste Cordonnier, 2018, Convex Optimization using Sparsified Stochastic Gradient Descent with Memory
Lie He, 2018, COLA: Communication-Efficient Decentralized Linear Learning
Wang Zhengchao, 2017, Network Optimization for Smart Grids
Marina Mannari, 2017, Faster Coordinate Descent for Machine Learning through Limited Precision Operations
Matilde Gargiani, 2017, Hessian-CoCoA: a general parallel and distributed framework for non-strongly convex regularizers

Namhoon Lee, 2020, Training compressed/sparse deep learning models
Srijan Bansal, 2019, Adaptive Stepsize Strategies for SGD
Shreshta Alevooru, 2019, Fairness Objectives for Training Word Embeddings
Xu Lu, 2019, Cross-lingual transfer learning and distillation with transformers
Scott Pesne, 2019, Convergence diagnostics for SGD
Ali Sabet, 2019, Robust Cross-lingual Embeddings from Parallel Sentences
Éloïse Berthier, 2019, Differential Privacy
Myriam Bégel, 2019, Medical Machine Learning
Lie He, 2018, CoLA and MLbench
Jean-Yves Franceschi, 2018, Unsupervised Scalable Representation Learning for Multivariate Time Series
Riccardo de Lutio, 2018, Medical Machine Learning
Evann Courdier, 2018, Learning World Models (jointly with Francois Fleuret)
Polina Kirichenko, 2018, Zero-order Optimization for Deep Learning
R S Nikhil Krishna, 2018, Importance Sampling and LSH
Prashant Rangarajan, 2018, Multilingual matrix factorizations
Jeenu Grover, 2018, Learning 2 Learn
Anastasia Koloskova, 2017, Coordinate Descent using LSH
Vasu Sharma, 2017, CNNs for Unsupervised Text Representation Learning
Pooja Kulkarni, 2017, Variable metric Coordinate Descent
Tao Lin, 2017, Adversarial Training for Text
Tina Fang, 2017, Generating Steganographic Text with LSTMs
Valentin Thomas, 2017, Model-parallel back-propagation
Anahita Talebi Amiri, 2017, Lasso – Distributed and Pair-Wise Features

Semester Projects:
Aryan Agal, 2019, Effects of Varying Integration Time and Beam Power on Machine Learning for Raman Spectroscopy
Lingjing Kong, 2019, Extrapolation for Large-batch Training in Deep Learning
Mingbo Cui, 2019, babyBERT: a distilled faster and smaller BERT
Sadra Boreiri, 2019, Decentralized SGD with Changing Topology and Local Updates
Fedor Moiseev, 2019, Efficient Federated Deep Learning with Non-IID Data
Harshvardhan, 2019, Convergence Analysis of SGD in the Large-Batch Regime
Mohamed Ndoye, 2019, Collaborative privacy: Incentivizing sharing of medical data among competing parties in rural settings
Damian Dudzicz, 2019, Decentralized Optimization with Local Push-Sum SGD
Oriol Barbany Mayor, 2019, New Efficient Training Methods for Robust Models
Rayan Elalamy, 2019, Convergence Analysis for Hogwild! under less Restrictive Assumptions
Lionel Gerard, 2019, Stochastic Extragradient for GAN Training
Jan Benzing, 2019, A Machine-Learning approach for imputation of missing values in a biomedical dataset of febrile patients in Tanzania
Hongyu Luo, 2019, Dream Detection from EEG Time Series
Nicola Ischia, 2019, Dream Detection from EEG Time Series
Sidakpal Singh, 2019, Structure-aware model averaging with Optimal Transport
Brock Grassy, 2019, Nano Manufacturing with ML and Raman Spectroscopy
Atul Kumar Sinha, 2019, Unsupervised Sentence Embeddings Using Transformers
Hajili Mammad, 2019, Unsupervised Sentence Embeddings Using Transformers
Claire Capelo, 2019, Adaptive schemes for communication compression: Trajectory Normalized Gradients for Distributed Optimization
Aakash Sunil Lahoti, 2019, Theoretical Analysis of Minimum of Sum of Functions
Jelena Banjac, 2019, Software Tools for Handling Magnetically Collected Ultra-thin Sections for Microscopy
Nikita Filippov, 2019, Differentially Private Decentralized Batch SGD Under Varied Conditions
Devavrat Tomar, 2019, Neural Voice Conversion
Lingjing Kong, 2019, Adaptive Methods for Large Batch Training
Peilin Kang, 2019, A comparison of model-parallel training methods for deep learning
Nicolas Lesimple, 2019, Automated Machine Learning
Ezgi Yuceturk, 2018, Dream Detection from EEG Time Series
Delisle Maxime, 2018, Twitter Demographics
Bojana Rankovic, 2018, Handwritten Text Recognition on Student Essays
Cem Musluoglu, 2018, Quantization and Compression for Distributed Optimization
Quentin Rebjock, 2018, Error Feedback Fixes SignSGD and other Gradient Compression Schemes
Jimi Vaubien, 2018, Derivative-Free Empirical Risk Minimization
Ali Hosseiny, 2018, Human Query – From natural language to SQL
Servan Grüninger, 2018, Location prediction from tweets
Marie-Jeanne Lagarde, 2018, Steganographic LSTM
Sidakpal Singh, 2018, Context Mover’s Distance & Barycenters: Optimal transport of contexts for building representations
Arthur Deschamps, 2018, Simulating Asynchronous SGD + numerical results
Chia-An Yu, 2018, Feedback & quantization in SGD
Kshitij Kumar Patel, 2018, Communication trade-offs for synchronized distributed SGD with large step size
Martin Josifoski, 2018, Cross-lingual word embeddings
William Borgeaud, 2017, Adaptive Sampling in Stochastic Coordinate Descent
Castellón Arevalo Joel, 2017, Complexity analysis for AdaRBFGS: a primitive for methods between first and second order
Alberto Chiappa, 2017, Asynchronous updates for Stochastic Gradient Descent
Ahmed Kulovic, 2017, Mortality Prediction from Twitter
Arno Schneuwly, 2017, Correlating Twitter Language with Community-Level Health Outcomes
Sina Fakheri, 2017, A Machine-learning Mobile App to support prognosis of Ebola Virus Diseases in an evolving environment
Remy Sun, 2017, A Convolutional Dictionary Method to Acquire Sentence Embeddings
Hakan Gökcesu, 2016, Distributed SGD with Fault Tolerance
Lie He, 2017, Distributed TensorFlow implementation of sparse CoCoA
Oberle Jeremia, 2016, A Machine-Learning Prediction Tool for the Triage and Clinical Management of Ebola Virus disease
Akhilesh Gotmare, 2016, ADMM for Model-Parallel Training of Neural Networks

Francesco Locatello: Greedy Optimization and Applications to Structured Tensor Factorizations,
Master thesis, ETH, September 2016

Dmytro Perekrestenko: Faster Optimization through Adaptive Importance Sampling,
Master thesis (jointly supervised with Volkan Cevher), EPFL, August 2016

Elias Sprengel: Audio Based Bird Species Identification using Deep Learning Techniques,
Master thesis (jointly supervised with Yannic Kilcher), ETH, August 2016

Jonathan Rosenthal: Deep Learning for Go
Bachelor thesis (jointly supervised with Yannic Kilcher and Thomas Hofmann), ETH, June 2016

Maurice Gonzenbach: Sentiment Classification and Medical Health Record Analysis using Convolutional Neural Networks,
Master thesis (jointly supervised with Valeria De Luca), ETH, May 2016

Jan Deriu: Sentiment Analysis using Deep Convolutional Neural Networks with Distant Supervision,
Master thesis (jointly supervised with Aurelien Lucchi), ETH, April 2016

Pascal Kaiser: Learning city structures from online maps,
Master thesis (jointly supervised with Aurelien Lucchi and Jan Dirk Wegner), ETH, March 2016

Adrian Kündig: Prediction of Cerebral Autoregulation in Intensive Care Patients,
Master thesis (jointly supervised with Valeria De Luca), ETH, January 2016

Bettina Messmer: Automatic Analysis of Large Text Corpora,
Master thesis (jointly supervised with Aurelien Lucchi), ETH, January 2016

Tribhuvanesh Orekondy: HADES: Hierarchical Approximate Decoding for Structured Prediction,
Master thesis (jointly supervised with Aurelien Lucchi), ETH, September 2015

Jakob Olbrich: Screening Rules for Convex Problems,
Master thesis (jointly supervised with Bernd Gärtner), ETH, September 2015

Sandro Felicioni: Latent Multi-Cause Model for User Profile Inference,
Master thesis (jointly supervised with Thomas Hofmann, and 1plusX), ETH, September 2015

Ruben Wolff: Distributed Structured Prediction for 3D Image Segmentation,
Master thesis (jointly supervised with Aurelien Lucchi), ETH, September 2015

Simone Forte: Distributed Optimization for Non-Strongly Convex Regularizers,
Master thesis (jointly supervised with Matthias Seeger, Amazon Berlin, and Virginia Smith, UC Berkeley), ETH, September 2015

Xiaoran Chen: Classification of stroke types with SNP and phenotype datasets,
Semester project (jointly supervised with Roqueiro Damian and Xiao He), ETH, June 2015

Yannic Kilcher: Towards efficient second-order optimization for big data,
Master thesis (jointly supervised with Aurelien Lucchi and Brian McWilliams), ETH, May 2015

Matthias Hüser: Forecasting intracranial hypertension using time series and waveform features,
Master thesis (jointly supervised with Valeria De Luca), ETH, April 2015

Lei Zhong: Adaptive Probabilities in Stochastic Optimization Algorithms,
Master thesis, ETH, April 2015

Maurice Gonzenbach: Prediction of Epileptic Seizures using EEG Data,
Semester project (jointly supervised with Valeria De Luca), ETH, Feb 2015

Julia Wysling: Screening Rules for the Support Vector Machine and the Minimum Enclosing Ball,
Bachelor’s thesis (jointly supervised with Bernd Gärtner), ETH, Feb 2015

Tribhuvanesh Orekondy: dissolvestruct – A distributed implementation of Structured SVMs using Spark,
Semester project, ETH, August 2014

Michel Verlinden: Sublinear time algorithms for Support Vector Machines,
Semester project, ETH, July 2011

Clément Maria: An Exponential Lower Bound on the Complexity of Regularization Paths,
Internship project (jointly supervised with Bernd Gärtner), ETH, August 2010

Dave Meyer: Implementierung von geometrischen Algorithmen für Support-Vektor-Maschinen,
Diploma thesis, ETH, August 2009

Gabriel Katz: Tropical Convexity, Halfspace Arrangements and Optimization,
Master’s thesis (jointly supervised with Uli Wagner), ETH, September 2008