Biography

I have been a PhD student since September 2021 under the supervision of Laurent Massoulié, working on the theory and algorithms of Machine Learning. I am particularly interested in optimization (in all its forms: distributed, decentralized, stochastic, …), topics related to Federated Learning, gossip algorithms and high-dimensional statistics, and deep learning and its theory.

During my PhD, I was a TA for courses at ENS Paris and in the M2 MASH program at PSL-Dauphine, including Deep Learning courses (teachers: Marc Lelarge and Kevin Scaman) and Network Algorithms courses (teacher: Ana Busic).

I interned at Microsoft Research (Redmond, WA) in the Foundations of Machine Learning group in Fall 2022, and at EPFL in the TML team led by Nicolas Flammarion in Fall 2023.

Interests
  • Optimization
  • Statistics
Education
  • Master's Program in Probability and Statistics, 2020

    Université d'Orsay/Paris-Saclay

  • Mathematics and Physics studies

    École Normale Supérieure de Paris

Publications

(2023). Implicit Regularisation, Large Stepsizes and Edge of Stability for (S)GD over Diagonal Linear Networks. Neural Information Processing Systems (NeurIPS), 2023.

(2023). Stochastic Gradient Descent under Markov Chain Sampling Schemes. International Conference on Machine Learning (ICML), 2023.

(2022). Asynchronous SGD Beats Minibatch SGD Under Arbitrary Delays. Neural Information Processing Systems (NeurIPS), 2022.

(2022). Muffliato: Peer-to-Peer Privacy Amplification for Decentralized Optimization and Averaging. Neural Information Processing Systems (NeurIPS), 2022.

(2022). On Sample Optimality in Personalized Federated and Collaborative Learning. Neural Information Processing Systems (NeurIPS), 2022.

(2021). A Continuized View on Nesterov Acceleration for Stochastic Gradient Descent and Randomized Gossip. Neural Information Processing Systems (NeurIPS), 2021. Outstanding Paper Award.

(2021). Fast Stochastic Bregman Gradient Methods: Sharp Analysis and Variance Reduction. International Conference on Machine Learning (ICML), 2021.

(2021). Asynchrony and Acceleration in Gossip Algorithms.

Conferences and Workshops

- Prairie Workshop, Paris, October 10, 2021.

- Workshop on Future Synergies for Stochastic and Learning Algorithms, CIRM, Luminy, Sept. 27-Oct. 1, 2021.

- Conference on Learning Theory (COLT), August 15-19, 2021. (talk, slides, poster)

- International Conference on Machine Learning (ICML), July 18-24, 2021. (spotlight presentation)

Teaching

- Mathematics of Deep Learning, ENS Paris, Fall 2021. Master course, tutorial sessions (TDs) with Kevin Scaman.

- Network Algorithms, ENS Paris, Fall 2021. Master course, tutorial sessions (TDs) with Ana Busic.

Contact