Since September 2020, I have been an assistant professor at ENSAE/CREST in the Statistics Department.
My main line of research is in statistical machine learning. I have been working on kernel methods, optimal transport and ranking data. Currently, I am particularly interested in dynamical particle systems for ML and kernel-based methods for causal inference.
From December 2018 to August 2020, I was a postdoctoral researcher at the Gatsby Unit, University College London (UCL), working with Arthur Gretton.
From October 2015 to October 2018, I was a PhD student at Télécom ParisTech, in the S2A (Signal, Statistics and Learning) team, supervised by Stephan Clémençon.
Before that, in 2015, I graduated from the MVA Master's program (Machine Learning and Computer Vision) at ENS Cachan and ENSAE.
More details can be found in my resume [EN] [FR].
- October 2021: New preprint with François Portier on Adaptive Importance Sampling!
- Fall 2021: I am very happy to participate in the program Geometric Methods in Optimization and Sampling at UC Berkeley.
- July 2021: Two papers accepted at ICML 2021! Kernel Stein Discrepancy Descent as a long oral and Proximal Learning with Kernels as a short oral. See here for resources. Thanks to my amazing coauthors!
- April 2021: I'll give a talk at the ITW session organized by Gergely Neu. The program can be found here.
- February 2021: Gave a talk at Heriot-Watt University on our paper about SVGD (NeurIPS 2020) [slides].
- October 2020: Gave a talk at IHP on our two NeurIPS 2020 papers, on SVGD and on the forward-backward scheme for sampling (slides).
- September 2020: Gave two talks at the Second Symposium on Machine Learning and Dynamical Systems at the Fields Institute:
- Maximum Mean Discrepancy Gradient Flow (NeurIPS 2019)
- Wasserstein Gradient Flows for Machine Learning, a joint talk with Adil Salim about our recent line of work.