Mathieu Even

Researcher (Chargé de Recherche) at Inria Montpellier

Inria Montpellier, PREMEDICAL Research Project team

Biography

Since October 1st, 2025, I have been a researcher (CR, Chargé de Recherche) at Inria Montpellier, in the Premedical team. I work on causal inference and machine learning applied to medical data. I also work with the startup Theremia Health.

Before that, I was a PhD student from September 2021 to June 2024 under the supervision of Laurent Massoulié, working on the theory and algorithms of machine learning, and then a postdoc in the Premedical team from October 2024 to September 2025. My PhD manuscript is available here; it was awarded an accessit (runner-up) for the Gilles Kahn PhD prize of the Société informatique de France, sponsored by the French Académie des Sciences.

Interests
  • Optimization
  • Statistics
  • Causal Inference
  • Medical Applications
Education
  • PhD in Applied Mathematics and Computer Science, 2021-2024

    Inria Paris and ENS Paris

  • Master's Program in Probability and Statistics, 2020

    Université d'Orsay/Paris-Saclay

  • Studies in Mathematics and Physics, 2017-2021

    École Normale Supérieure de Paris

Publications

(2025). Model Agnostic Differentially Private Causal Inference.

PDF

(2024). Long-Context Linear System Identification.

PDF

(2024). Asynchronous Speedup in Decentralized Optimization. IEEE Transactions on Automatic Control, 2024.

PDF Cite

(2024). Asynchronous SGD on Graphs: a Unified Framework for Asynchronous Decentralized and Federated Optimization. AISTATS, 2024.

PDF

(2023). Aligning Embeddings and Geometric Random Graphs: Informational Results and Computational Approaches for the Procrustes-Wasserstein Problem. Neural Information Processing Systems (NeurIPS), 2023.

PDF

(2023). Implicit Regularisation, Large Stepsizes and Edge of Stability for (S)GD over Diagonal Linear Networks. Neural Information Processing Systems (NeurIPS), 2023.

PDF

(2023). Stochastic Gradient Descent under Markov Chain Sampling Schemes. International Conference on Machine Learning (ICML), 2023.

PDF

(2022). Asynchronous SGD Beats Minibatch SGD Under Arbitrary Delays. Neural Information Processing Systems (NeurIPS), 2022.

PDF Cite

(2022). Muffliato: Peer-to-Peer Privacy Amplification for Decentralized Optimization and Averaging. Neural Information Processing Systems (NeurIPS), 2022.

PDF Cite

(2022). On Sample Optimality in Personalized Federated and Collaborative Learning. Neural Information Processing Systems (NeurIPS), 2022.

PDF Cite

(2021). A Continuized View on Nesterov Acceleration for Stochastic Gradient Descent and Randomized Gossip. Neural Information Processing Systems (NeurIPS), 2021. Outstanding Paper Award.

PDF Cite

(2021). Fast Stochastic Bregman Gradient Methods: Sharp Analysis and Variance Reduction. International Conference on Machine Learning (ICML), 2021.

PDF Cite

Conferences and Workshops

- Prairie Workshop, Paris, October 10, 2021.

- Workshop on Future Synergies for Stochastic and Learning Algorithms, CIRM, Luminy, Sept. 27-Oct. 1, 2021.

- Conference on Learning Theory (COLT), August 15-19, 2021. (talk, slides, poster)

- International Conference on Machine Learning (ICML), July 18-24, 2021. (spotlight presentation)

Teaching

- Mathematics of Deep Learning, ENS Paris, Fall 2021. Master course, tutorial sessions (TDs) with Kevin Scaman. Course Website

- Network Algorithms, ENS Paris, Fall 2021. Master course, tutorial sessions (TDs) with Ana Busic.

Contact