I have been a PhD student since September 2021 under the supervision of Laurent Massoulié, working on the theory and algorithms of Machine Learning. I am particularly interested in optimization (in all its forms: distributed, decentralized, stochastic, …), topics related to Federated Learning, gossip algorithms, and high-dimensional statistics.

Interests

  • Optimization
  • Statistics
  • Surfing
  • Hiking
Education

  • Master's Program in Probability and Statistics, 2020

    Université d'Orsay/Paris-Saclay

  • Mathematics and Physics studies

    École Normale Supérieure de Paris


Publications

(2023). Implicit Regularisation, Large Stepsizes and Edge of Stability for (S)GD over Diagonal Linear Networks. Preprint.


(2022). Asynchronous SGD Beats Minibatch SGD Under Arbitrary Delays. Neural Information Processing Systems (NeurIPS), 2022.

(2022). Muffliato: Peer-to-Peer Privacy Amplification for Decentralized Optimization and Averaging. Neural Information Processing Systems (NeurIPS), 2022.

(2022). On Sample Optimality in Personalized Federated and Collaborative Learning. Neural Information Processing Systems (NeurIPS), 2022.

(2021). A Continuized View on Nesterov Acceleration for Stochastic Gradient Descent and Randomized Gossip. Neural Information Processing Systems (NeurIPS), 2021. Outstanding Paper Award.

(2021). Fast Stochastic Bregman Gradient Methods: Sharp Analysis and Variance Reduction. International Conference on Machine Learning (ICML), 2021.

(2021). Asynchrony and Acceleration in Gossip Algorithms.

Conferences and Workshops

- Prairie Workshop, Paris, October 10, 2021.

- Workshop on Future Synergies for Stochastic and Learning Algorithms, CIRM, Luminy, Sept. 27-Oct. 1, 2021.

- Conference on Learning Theory (COLT), August 15-19, 2021. (talk, slides, poster)

- International Conference on Machine Learning (ICML), July 18-24, 2021. (spotlight presentation)


Teaching

- Mathematics of Deep Learning, ENS Paris, Fall 2021. Master course, tutorials (TDs) with Kevin Scaman. Course Website

- Network Algorithms, ENS Paris, Fall 2021. Master course, tutorials (TDs) with Ana Busic.