Anna Korba
About me
Since September 2020, I have been an assistant professor at ENSAE/CREST in the Statistics Department.
My main line of research is machine learning. I have been working on kernel methods, optimal transport, optimisation, particle systems and preference learning. At the moment I am particularly interested in sampling and optimisation methods.
In 2025, I was awarded an ERC Starting Grant for my project Optinfinite.
Team
In my academic activities, I am lucky to work closely with the following people:
- Yedidia Agnimo, PhD student, co-advised with Karteek Alahari and Nicolas Chesneau
- Imad Aouali, PhD student, co-advised with Victor-Emmanuel Brunel and David Rohde
- Paul Caucheteux, PhD student
- Clémentine Chazal, PhD student
- Marguerite Petit-Talamon, PhD student
- Christophe Vauthier, PhD student, co-advised with Quentin Mérigot
- Adrien Vacher, postdoctoral researcher
- Oussama Zekri, PhD student
Alumni
News
- September 2025: Two papers accepted at NeurIPS. The first (paper) is on variational inference with mixtures of isotropic Gaussians, with Marguerite Petit-Talamon (CREST, ENSAE) and Marc Lambert (INRIA). The second (here) is on the complexity of sampling from an unnormalized density with reverse diffusions, with Adrien Vacher and Omar Chehab.
- September 2025: New preprint on A Computable Measure of Suboptimality for Entropy-Regularised Variational Objectives. Part of this work was carried out during the visit of Clémentine Chazal (my PhD student) to the University of Newcastle as a visiting fellow, hosted by and working with Chris Oates, Heishiro Kanagawa, and Zheyang Shen. The paper introduces a way to evaluate sample approximations to optima of entropy-regularised objectives of the form min_Q L(Q) + KL(Q||Q_0), a problem arising in the emerging post-Bayesian paradigm and in the mean-field analysis of neural networks. In practice, we obtain and use sample approximations, but their KL term is undefined, so the objective itself cannot be computed.
This work therefore presents a computable alternative, termed 'gradient discrepancies', which turns out to generalise the kernel Stein discrepancy! (A rough sketch of the computability issue is given after this news list.)
- July 2025: New preprint with Arturo Castellanos, Pavlo Mozharovskyi, and Hicham Janati. We propose a new distance between probability distributions, based on comparing them through the Schatten norm of their kernel covariance operators (an illustrative sketch is given after this news list).
- July 2025 (ICML): Clément Bonet, Christophe Vauthier, and Omar Chehab presented our work on flowing datasets with Wasserstein-over-Wasserstein flows, on properties of the Sliced-Wasserstein flow, and on density ratio estimation.
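As a rough sketch of the computability issue mentioned in the September 2025 preprint above (the notation here is mine, not necessarily the paper's), the problem is

$$
\min_{Q}\; L(Q) + \mathrm{KL}(Q \,\|\, Q_0),
\qquad
\widehat{Q}_N = \frac{1}{N}\sum_{i=1}^{N}\delta_{x_i},
\qquad
\mathrm{KL}(\widehat{Q}_N \,\|\, Q_0) = +\infty \ \text{ whenever } Q_0 \text{ has a density,}
$$

since the empirical measure $\widehat{Q}_N$ returned by a sampling algorithm is not absolutely continuous with respect to $Q_0$. The objective therefore cannot be evaluated at the sample approximation, which is the gap the gradient discrepancies are meant to fill.

For the July 2025 preprint on Schatten norms of kernel covariance operators, here is a minimal illustrative sketch in Python (not the estimator from the preprint): it approximates the kernel covariance operators of two samples with random Fourier features for a Gaussian kernel, then compares them through a Schatten p-norm of their difference. The bandwidth, feature dimension, and choice of p are placeholders.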
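```python
# Illustrative sketch only: finite-dimensional approximation of a Schatten-norm
# distance between kernel covariance operators, via random Fourier features.
import numpy as np

def rff_features(X, W, b):
    """Random Fourier features approximating a Gaussian kernel."""
    D = W.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def schatten_covariance_distance(X, Y, p=2, D=512, sigma=1.0, seed=0):
    """Schatten p-norm between (uncentered) empirical kernel covariance operators."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Spectral sampling for the Gaussian kernel exp(-||x-y||^2 / (2 sigma^2)).
    W = rng.normal(scale=1.0 / sigma, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    phi_X = rff_features(X, W, b)                    # (n, D) feature maps
    phi_Y = rff_features(Y, W, b)
    C_X = phi_X.T @ phi_X / len(X)                   # empirical covariance of X in feature space
    C_Y = phi_Y.T @ phi_Y / len(Y)                   # empirical covariance of Y in feature space
    sv = np.linalg.svd(C_X - C_Y, compute_uv=False)  # singular values of the difference
    return np.sum(sv ** p) ** (1.0 / p)              # Schatten p-norm

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 2))
    Y = rng.normal(loc=0.5, size=(500, 2))
    print(schatten_covariance_distance(X, Y, p=2))
```

With p=2 this reduces to the Hilbert-Schmidt (Frobenius) norm of the difference of the approximate covariance operators; other values of p weight the spectrum differently.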
Bio
From December 2018 to August 2020, I was a postdoctoral researcher at the Gatsby Unit, University College London (UCL), working with Arthur Gretton.
From October 2015 to October 2018, I was a PhD student at Télécom ParisTech, in the S2A (Signal, Statistics and Learning) team, supervised by Stephan Clémençon.
Before that, in 2015, I graduated from the MVA Master's programme (Machine Learning and Computer Vision) at ENS Cachan and from ENSAE.
More details can be found in my resume [EN].