- Summer school
2019/06/27: We’re teaching a one-day course on Optimization for Machine Learning and Deep Learning at the Paris DS3 summer school.
- Papers at ICML
2019/05/08: Four papers from our group were accepted at this year’s ICML conference: on decentralized SGD & gossip (video), signSGD and error compensation, multi-model training, and interpretable LSTMs.
- Optimization for ML course
2019/02/22: The course Optimization for Machine Learning (CS-439) is starting. All lecture materials are publicly available on our GitHub.
- New papers by our group
2019/02/20: New papers on local SGD (ICLR), greedy optimization (AISTATS), multilingual embeddings, code (WSDM), keyphrase extraction, code (CoNLL), and word embeddings (NAACL)
- Applied Machine Learning Days
2019/01/10: We are co-organizing this year’s event with over 2000 attendees. MLO sessions include the Industry Track, Theory meets Practice Workshop, and PyTorch Workshop.
- Machine learning course starting
2018/09/18: The Machine Learning course CS-433 has started with over 530 students enrolled. We need a larger room!
- Papers at NeurIPS
2018/09/11: Four papers from our group were accepted at this year’s popular NeurIPS conference: CoLa, block-floating-point, Sparsified SGD (video), and BFGS.
- Application Impact
2018/09/09: We’re excited that some of our work found impact in diverse fun areas including Koala Activity Detection, Electric Vehicle Charging, Toxicogenomics and Victorian Era Authorship Attribution.
- MLBench – Distributed ML Benchmarking
2018/09/07: We’re launching MLBench, a public, reproducible benchmark suite with reference implementations for distributed machine learning algorithms, frameworks and systems.
- COLA: Decentralized Linear Learning
2018/08/13: New paper and code for decentralized learning.
- Keyphrase Extraction
2018/07/27: The paper Simple Unsupervised Keyphrase Extraction using Sentence Embeddings was accepted at CoNLL 2018.
- Faculty position for Aymeric at École Polytechnique in Paris
2018/06/01: Aymeric Dieuleveut, postdoc at MLO, has accepted a faculty position at École Polytechnique in Paris.
- Two new papers appearing at ICML
2018/05/11: On Matching Pursuit and Coordinate Descent by Francesco Locatello, Anant Raj, Sai Praneeth Karimireddy, Gunnar Rätsch, Bernhard Schölkopf, Sebastian U. Stich, Martin Jaggi and A Distributed Second-Order Algorithm You Can Trust by Celestine Dünner, Aurelien Lucchi, Matilde Gargiani, An Bian, Thomas Hofmann, Martin Jaggi.
- Google Focused Research Award
2018/03/23: We have been awarded a Google Focused Research Award 2018 in the area of Machine Learning, jointly with Alexandre d’Aspremont and Francis Bach.
- Algorithms in the wild
2018/03/20: IBM and NVIDIA announced their new partnership, citing 46x faster training of logistic regression resulting from a combination of our DuHL and CoCoA algorithms, in front of 40’000 attendees at the IBM Think conference.
- New Optimization for ML course
2018/02/23: A brand new course – Optimization for Machine Learning – CS-439, has started with 110 students enrolled. All lecture materials are publicly available on our GitHub.
- sent2vec paper accepted at NAACL
2018/02/15: General purpose document embeddings, trained unsupervised. Get them while they are fresh! Congrats Matteo Pagliardini and Prakhar Gupta!
- AISTATS Paper on Adaptive First-Order Optimization
2018/02/01: Congrats to Praneeth Karimireddy and Sebastian Stich, for the paper Adaptive Balancing of Gradient and Update Computation Times using Global Geometry and Approximate Subproblems.
- Faster Linear Learning using GPUs
2017/12/08: Celestine’s NIPS paper was covered on HackerNews, as well as by IBM and Nvidia. Here is the 3min video.
- Video: Safe Adaptive Importance Sampling
2017/11/25: A 3min video for Sebastian’s upcoming NIPS spotlight presentation.
- sent2vec – features for text
2017/10/01: Our general purpose features for short texts have found many applications and have already reached 100 (update: >600) stars on GitHub. The representations are trained unsupervised, are very efficient to compute, and can be used for any downstream machine learning task.
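As a rough illustration of why such features are so efficient to compute (a toy sketch of the averaging idea behind sent2vec, not the library’s actual API; the vocabulary and vectors below are made up):

```python
import numpy as np

# Hypothetical toy vocabulary of pre-trained word vectors; the real model
# learns these (together with n-gram vectors) unsupervised from a large corpus.
word_vectors = {
    "machine":  np.array([0.2, 0.5, -0.1]),
    "learning": np.array([0.4, -0.3, 0.6]),
    "rocks":    np.array([-0.1, 0.2, 0.3]),
}

def sentence_embedding(sentence, dim=3):
    """Average the vectors of the known words -- one pass over the tokens,
    so computing an embedding is essentially free at inference time."""
    vecs = [word_vectors[w] for w in sentence.lower().split() if w in word_vectors]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

emb = sentence_embedding("Machine learning rocks")
```

The resulting fixed-length vector can then be fed to any downstream classifier or regressor.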
- Project Funding from SNSF
2017/09/30: Our project “Reliable Open-Source Large-Scale Machine Learning” has received 3 years of funding from the Swiss National Science Foundation.
- Machine Learning Course Started
2017/09/19: The Machine Learning course (CS-433) has started this week with over 440 students enrolled.
- NIPS conference: Importance Sampling, Heterogeneous Systems, and Cone Optimization
2017/09/04: Three papers were accepted at the upcoming NIPS conference: Safe Adaptive Importance Sampling (spotlight) by Sebastian Stich, Anant Raj and Martin Jaggi; Efficient Use of Limited-Memory Accelerators for Linear Learning on Heterogeneous Systems by Celestine Dünner, Thomas Parnell and Martin Jaggi; and Greedy Algorithms for Cone Constrained Optimization with Convergence Guarantees by Francesco Locatello, Michael Tschannen, Gunnar Raetsch and Martin Jaggi.
- Short Course on Optimization for Machine Learning
2017/07/05: Brief lecture notes and practical labs with solutions for the Pre-doc Summer School on Learning Systems in Zurich.
- Approximate Steepest Coordinate Descent accepted at ICML
2017/05/14: The paper by Sebastian Stich, Anant Raj and Martin Jaggi was accepted for publication at ICML 2017.
- Google Research Award
2017/02/23: Our project “A Computational View on Sentence Embeddings” received a Google Faculty Research Award 2016 in the area of Machine Learning and Data Mining.
- Greedy and Coordinate Algorithms at AISTATS
2017/02/05: Two papers were accepted at the AISTATS conference: A Unified Optimization View on Generalized Matching Pursuit and Frank-Wolfe by Francesco Locatello, Rajiv Khanna, Michael Tschannen and Martin Jaggi – and Faster Coordinate Descent via Adaptive Importance Sampling by Dmytro Perekrestenko, Volkan Cevher and Martin Jaggi.
- Applied Machine Learning Days
2017/01/31: More than 450 participants from across industry and academia attended the first edition of the Applied ML Days, organized by the labs of Marcel Salathé, Robert West and Martin Jaggi. Videos of the presentations are available here on YouTube.
- Text Sentiment Analysis Paper Accepted at the WWW Conference
2017/01/03: Our paper on multi-lingual text sentiment analysis using convolutional neural networks was accepted at WWW 2017. This is joint work of Jan Deriu, Aurelien Lucchi, Valeria De Luca, Mark Cieliebak, Simon Müller, Aliaksei Severyn, Thomas Hofmann and Martin Jaggi.
- Welcome Mikhail and Sebastian
2016/12/01: Two amazing senior researchers have joined our lab this December. Welcome Mikhail Langovoy and Sebastian U. Stich!
- Our Distributed Machine Learning Algorithm in TensorFlow
2016/09/20: Google has implemented our CoCoA+ algorithm as the default distributed solver in the popular TensorFlow framework, for linear machine learning models. The code is open source and our papers describing the method are available here, here, and here.
- Machine Learning Course Started
2016/09/20: The Machine Learning course (CS-433) has started this week with 298 students enrolled.
- Winner of the BirdCLEF 2016 Competition on Audio-Based Bird Species Classification
2016/09/06: The system developed by master student Elias Sprengel won this year’s international competition on bird species identification, using a deep learning approach with promising applications in ecology.
- Funding from Microsoft Research
2016/08/19: Our project “Coltrain: Co-located Deep Learning Training and Inference”, jointly with the lab of Babak Falsafi, has received two years of funding from Microsoft Research. Ten projects have received funding under this new Swiss Joint Research Centre initiative.
- Master thesis of Dmytro Perekrestenko on Importance Sampling
2016/08/17: Dmytro Perekrestenko has finished and defended his master thesis “Faster optimization through adaptive importance sampling”.
- Start of Machine Learning and Optimization Laboratory
2016/08/01: The Machine Learning and Optimization Laboratory officially started at EPFL.
- Paper “Primal-Dual Rates and Certificates” at ICML
2016/06/19: New paper appearing at this year’s ICML conference “Primal-Dual Rates and Certificates”. Here is a poster of it. Our approach allows more optimization problems to be equipped with practical accuracy certificates, such as L1, elastic-net, TV and others. By Celestine Dünner, Simone Forte, Martin Takac, Martin Jaggi.
- Winner of the SemEval 2016 Competition on Text Sentiment Analysis
2016/06/17: Our system developed by master students Jan Deriu and Maurice Gonzenbach won this year’s text sentiment classification competition, placing first out of 34 teams from all over the world. Here is a news article, and our systems description paper.