Most of our research is devoted to statistical mechanics, quantum field theory and deep learning theory.
Key projects include:
- The investigation of the connection between lattice models and conformal field theories. More precisely, we are interested in rigorously describing phase transitions in lattice models in terms of conformal field theories, in revealing conformal field theory structures within lattice models, and in connecting field theories with probabilistic objects such as random curves and fields. In particular, we aim to construct mathematically precise bridges between these objects, thereby rigorously connecting the discrete and continuous models, the statistical and quantum theories, and the algebraic and probabilistic structures that arise in the study of phase transitions.
- The dynamics of learning, in particular that of deep neural networks during training. More precisely, we investigate the dynamics of neural networks during supervised learning (e.g. regression or classification) or unsupervised learning (e.g. generative adversarial networks), using tools from probability, functional analysis, and algebra. In particular, we investigate the connections with other approaches to learning (e.g. kernel methods) and what makes neural networks so effective across such a wide range of tasks.
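To give a concrete flavour of the first project, here is a minimal toy sketch (not research code) of the prototypical lattice model in this story: Metropolis sampling of the 2D Ising model on a torus, at the exact critical inverse temperature β = log(1 + √2)/2, where the scaling limit is expected to be described by a conformal field theory. The lattice size, sweep count, and function name are illustrative choices, not fixed conventions.

```python
import numpy as np

def metropolis_ising(L=16, beta=0.4407, n_sweeps=200, seed=0):
    """Metropolis sampling of the 2D Ising model on an L x L torus.

    beta defaults to the critical inverse temperature
    log(1 + sqrt(2)) / 2 ~ 0.4407, where the model's scaling limit
    is expected to be conformally invariant.
    """
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=(L, L))
    for _ in range(n_sweeps):
        for _ in range(L * L):
            i, j = rng.integers(0, L, size=2)
            # Sum of the four nearest-neighbour spins (periodic boundary).
            nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2 * spins[i, j] * nb  # energy change if spin (i, j) flips
            # Accept the flip with the Metropolis probability min(1, e^{-beta dE}).
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] *= -1
    return spins

spins = metropolis_ising()
m = abs(spins.mean())  # magnetisation per site
print(f"|magnetisation| = {m:.3f}")
```

Observables of such samples, e.g. interfaces between spin clusters, are the random curves whose conjectured conformally invariant limits connect the discrete model to the continuum field theory.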
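For the second project, the kernel connection can be illustrated with a toy computation (again, a sketch under simplifying assumptions, not a result from our work): for a linear model trained by gradient descent on the squared loss, the training residual contracts along the eigenmodes of the Gram ("kernel") matrix, so the entire training trajectory is known in closed form. For very wide neural networks, the analogous object is the neural tangent kernel; all names and sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 20, 5
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

# Gram ("kernel") matrix of the linear model f(x) = <w, x>; for a very
# wide neural network the analogous object is the neural tangent kernel.
K = X @ X.T
evals, evecs = np.linalg.eigh(K)

# Plain gradient descent on the squared loss 0.5 * ||Xw - y||^2, from w = 0.
lr, steps = 1e-2, 300
w = np.zeros(d)
for _ in range(steps):
    w -= lr * X.T @ (X @ w - y)

# Kernel prediction: the residual r_t = Xw_t - y satisfies
# r_{t+1} = (I - lr * K) r_t, so each eigenmode of K decays as
# (1 - lr * lambda_i)^t -- no training run is needed to know r_t.
coeffs0 = evecs.T @ (-y)  # residual at t = 0 is -y
predicted = evecs @ (coeffs0 * (1 - lr * evals) ** steps)
actual = X @ w - y
print(np.max(np.abs(predicted - actual)))  # agree up to floating-point error
```

Understanding when, and in what sense, this linearised picture captures the training dynamics of genuinely nonlinear deep networks is one way to phrase the questions this project studies.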