Publications

(2023). Implicit Regularisation, Large Stepsizes and Edge of Stability for (S)GD over Diagonal Linear Networks. Neural Information Processing Systems (NeurIPS), 2023.

PDF

(2023). Stochastic Gradient Descent under Markov Chain Sampling Schemes. International Conference on Machine Learning (ICML), 2023.

PDF

(2022). Muffliato: Peer-to-Peer Privacy Amplification for Decentralized Optimization and Averaging. Neural Information Processing Systems (NeurIPS), 2022.

PDF Cite

(2022). Asynchronous SGD Beats Minibatch SGD Under Arbitrary Delays. Neural Information Processing Systems (NeurIPS), 2022.

PDF Cite

(2022). On Sample Optimality in Personalized Federated and Collaborative Learning. Neural Information Processing Systems (NeurIPS), 2022.

PDF Cite

(2021). A Continuized View on Nesterov Acceleration for Stochastic Gradient Descent and Randomized Gossip. Neural Information Processing Systems (NeurIPS), 2021. Outstanding Paper Award.

PDF Cite

(2021). Fast Stochastic Bregman Gradient Methods: Sharp Analysis and Variance Reduction. International Conference on Machine Learning (ICML), 2021.

PDF Cite

(2021). Asynchrony and Acceleration in Gossip Algorithms.

PDF Cite