“Random subspace methods for non-convex optimization”
Friday March 10, 2023 | Time 9:00am CET
In this talk, we present a randomized subspace regularized Newton method for non-convex functions. We show that the method converges globally under appropriate assumptions, and that its convergence rate matches that of the full regularized Newton method. Furthermore, under some additional assumptions, we obtain a local linear convergence rate and prove that this rate is the best one can hope for when using random subspaces.
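To make the idea concrete, here is a minimal sketch of one random-subspace regularized Newton iteration, not the speaker's exact algorithm: the gradient and Hessian are projected onto a random low-dimensional subspace via a Gaussian sketching matrix, a regularized Newton step is solved in that subspace, and the step is lifted back to the full space. All names and the specific regularization rule (`subspace_dim`, `reg`, the eigenvalue shift) are illustrative assumptions.

```python
import numpy as np

def subspace_newton_step(x, grad_f, hess_f, subspace_dim, reg, rng):
    """One illustrative random-subspace regularized Newton step."""
    n = x.size
    # Random subspace: Gaussian sketching matrix P in R^{n x s}
    P = rng.standard_normal((n, subspace_dim)) / np.sqrt(subspace_dim)
    g = grad_f(x)
    H = hess_f(x)
    # Project gradient and Hessian onto the subspace
    gs = P.T @ g          # s-vector
    Hs = P.T @ H @ P      # s x s matrix
    # Regularization keeps the subspace model positive definite
    # even when H is indefinite (the function is non-convex)
    lam = reg * np.linalg.norm(gs) + max(0.0, -np.linalg.eigvalsh(Hs)[0]) + 1e-8
    d = np.linalg.solve(Hs + lam * np.eye(subspace_dim), -gs)
    # Lift the subspace step back to the full space
    return x + P @ d

# Usage: a few iterations on a simple non-convex test function,
# keeping a step only when it decreases the objective
def f(x):      return 0.25 * np.sum(x**4) - 0.5 * np.sum(x**2)
def grad_f(x): return x**3 - x
def hess_f(x): return np.diag(3.0 * x**2 - 1.0)

rng = np.random.default_rng(0)
x = rng.standard_normal(20)
x0 = x.copy()
for _ in range(100):
    x_new = subspace_newton_step(x, grad_f, hess_f, subspace_dim=5, reg=1.0, rng=rng)
    if f(x_new) < f(x):
        x = x_new
```

The appeal of the subspace approach is that each iteration only factors an s × s matrix instead of the full n × n Hessian, which is what makes Newton-type methods tractable at large scale.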
I graduated from the French engineering school Ensimag in 2007. In 2010, I did a master's internship at the Optimization Laboratory of Kyoto University. During my Ph.D. (completed in 2013), I worked on Robust Optimization under the supervision of Marie-Christine Costa and Alain Billionnet, at the CEDRIC and UMA laboratories.
I did a postdoc at LIX – Ecole Polytechnique, under the supervision of Leo Liberti, working on Network Optimization and Bilevel Programming for the SoGrid project, as well as on new probabilistic methods based on the measure concentration phenomenon for solving large-scale optimization problems. I then did another postdoc at Ensta-Paristech, working on a PGMO project on robust Steiner tree problems. In January 2017, I joined Huawei Technologies as a research scientist and worked on optimization problems in networks; I then moved to Japan and worked as an R&D engineer on Machine Learning at a Japanese company. Since September 2018, I have been working at RIKEN – AIP as a researcher in the Continuous Optimization Team led by Prof. Takeda.