Dmitry Kovalev

Senior Researcher

Bio

I'm a senior researcher at Yandex Research, where I study optimization algorithms. Before joining Yandex, I was a postdoctoral researcher at the Université catholique de Louvain, working with Yurii Nesterov. I received my PhD in Computer Science from the King Abdullah University of Science and Technology, where I worked in the Visual Computing Center under the supervision of Peter Richtárik. Before that, I received my BS degree in Applied Mathematics and Physics from the Moscow Institute of Physics and Technology under the supervision of Alexander Gasnikov.

Interests

  • Optimization
  • Federated and Distributed Learning
  • Machine Learning

Education

PhD in Computer Science

King Abdullah University of Science and Technology

BS in Applied Mathematics and Physics

Moscow Institute of Physics and Technology

Experience

Senior Researcher

Yandex Research

2023 – Present • Moscow, Russia

Postdoctoral Researcher

Université catholique de Louvain

2022 – 2023 • Louvain-la-Neuve, Belgium

Research Intern

Institute for System Programming

2021 – 2023 • Moscow, Russia

Researcher

Moscow Institute of Physics and Technology

2022 • Moscow, Russia

Publications

2024

 Decentralized convex optimization on time-varying networks with application to Wasserstein barycenters (Olga Yufereva, Michael Persiianov, Pavel Dvurechensky, Alexander Gasnikov, Dmitry Kovalev). Computational Management Science, 2024.
 Decentralized saddle-point problems with different constants of strong convexity and strong concavity (Dmitry Metelev, Alexander Rogozin, Alexander Gasnikov, Dmitry Kovalev). Computational Management Science, 2024.
 Decentralized saddle point problems via non-Euclidean mirror prox (Alexander Rogozin, Aleksandr Beznosikov, Darina Dvinskikh, Dmitry Kovalev, Pavel Dvurechensky, Alexander Gasnikov). Optimization Methods and Software, 2024.

2023

 Non-smooth setting of stochastic decentralized convex optimization problem over time-varying graphs (Aleksandr Lobanov, Andrew Veprikov, Georgiy Konin, Aleksandr Beznosikov, Alexander Gasnikov, Dmitry Kovalev). Computational Management Science, 2023.
 Decentralized Convex Optimization over Time-Varying Graphs (Alexander Rogozin, Alexander Gasnikov, Aleksandr Beznosikov, Dmitry Kovalev). Encyclopedia of Optimization, 2023.
 Smooth monotone stochastic variational inequalities and saddle point problems: A survey (Aleksandr Beznosikov, Boris Polyak, Eduard Gorbunov, Dmitry Kovalev, Alexander Gasnikov). European Mathematical Society Magazine, 2023.
 Is consensus acceleration possible in decentralized optimization over slowly time-varying networks? (Dmitry Metelev, Alexander Rogozin, Dmitry Kovalev, Alexander Gasnikov). International Conference on Machine Learning, 2023.
 Stochastic distributed learning with gradient quantization and double-variance reduction (Samuel Horváth, Dmitry Kovalev, Konstantin Mishchenko, Peter Richtárik, Sebastian Stich). Optimization Methods and Software, 2023.

2022

 Accelerated primal-dual gradient method for smooth and convex-concave saddle-point problems with bilinear coupling (Dmitry Kovalev, Alexander Gasnikov, Peter Richtárik). Advances in Neural Information Processing Systems, 2022.
 Communication acceleration of local gradient methods via an accelerated primal-dual algorithm with an inexact prox (Abdurakhmon Sadiev, Dmitry Kovalev, Peter Richtárik). Advances in Neural Information Processing Systems, 2022.
 Optimal algorithms for decentralized stochastic variational inequalities (Dmitry Kovalev, Aleksandr Beznosikov, Abdurakhmon Sadiev, Michael Persiianov, Peter Richtárik, Alexander Gasnikov). Advances in Neural Information Processing Systems, 2022.
 Optimal gradient sliding and its application to optimal distributed optimization under similarity (Dmitry Kovalev, Aleksandr Beznosikov, Ekaterina Borodich, Alexander Gasnikov, Gesualdo Scutari). Advances in Neural Information Processing Systems, 2022.
 The first optimal acceleration of high-order methods in smooth convex optimization (Dmitry Kovalev, Alexander Gasnikov). Advances in Neural Information Processing Systems, 2022.
 The first optimal algorithm for smooth and strongly-convex-strongly-concave minimax optimization (Dmitry Kovalev, Alexander Gasnikov). Advances in Neural Information Processing Systems, 2022.
 Accelerated variance-reduced methods for saddle-point problems (Ekaterina Borodich, Vladislav Tominin, Yaroslav Tominin, Dmitry Kovalev, Alexander Gasnikov, Pavel Dvurechensky). EURO Journal on Computational Optimization, 2022.
 An optimal algorithm for strongly convex minimization under affine constraints (Adil Salim, Laurent Condat, Dmitry Kovalev, Peter Richtárik). International Conference on Artificial Intelligence and Statistics, 2022.
 IntSGD: Adaptive floatless compression of stochastic gradients (Konstantin Mishchenko, Bokun Wang, Dmitry Kovalev, Peter Richtárik). International Conference on Learning Representations, 2022.

2021

 Lower bounds and optimal algorithms for smooth and strongly convex decentralized optimization over time-varying networks (Dmitry Kovalev, Elnur Gasanov, Alexander Gasnikov, Peter Richtárik). Advances in Neural Information Processing Systems, 2021.
 A linearly convergent algorithm for decentralized optimization: Sending less bits for free! (Dmitry Kovalev, Anastasia Koloskova, Martin Jaggi, Peter Richtárik, Sebastian Stich). International Conference on Artificial Intelligence and Statistics, 2021.
 ADOM: accelerated decentralized optimization method for time-varying networks (Dmitry Kovalev, Egor Shulgin, Peter Richtárik, Alexander Rogozin, Alexander Gasnikov). International Conference on Machine Learning, 2021.
 Near-optimal decentralized algorithms for saddle point problems over time-varying networks (Aleksandr Beznosikov, Alexander Rogozin, Dmitry Kovalev, Alexander Gasnikov). International Conference on Optimization and Applications (OPTIMA 2021), 2021.
 Towards accelerated rates for distributed optimization over time-varying networks (Alexander Rogozin, Vladislav Lukoshkin, Alexander Gasnikov, Dmitry Kovalev, Egor Shulgin). International Conference on Optimization and Applications (OPTIMA 2021), 2021.

2020

 Linearly converging error compensated SGD (Eduard Gorbunov, Dmitry Kovalev, Dmitry Makarenko, Peter Richtárik). Advances in Neural Information Processing Systems, 2020.
 Optimal and practical algorithms for smooth and strongly convex decentralized optimization (Dmitry Kovalev, Adil Salim, Peter Richtárik). Advances in Neural Information Processing Systems, 2020.
 Don’t jump through hoops and remove those loops: SVRG and Katyusha are better without the outer loop (Dmitry Kovalev, Samuel Horváth, Peter Richtárik). Algorithmic Learning Theory, 2020.
 Accelerated methods for saddle-point problem (Mohammad Alkousa, Alexander Gasnikov, Darina Dvinskikh, Dmitry Kovalev, Fedor Stonyakin). Computational Mathematics and Mathematical Physics, 2020.
 Revisiting stochastic extragradient (Konstantin Mishchenko, Dmitry Kovalev, Egor Shulgin, Peter Richtárik, Yura Malitsky). International Conference on Artificial Intelligence and Statistics, 2020.
 Acceleration for compressed gradient descent in distributed and federated optimization (Zhize Li, Dmitry Kovalev, Xun Qian, Peter Richtárik). International Conference on Machine Learning, 2020.
 From local SGD to local fixed-point methods for federated learning (Grigory Malinovskiy, Dmitry Kovalev, Elnur Gasanov, Laurent Condat, Peter Richtárik). International Conference on Machine Learning, 2020.
 Variance reduced coordinate descent with acceleration: New method with a surprising application to finite-sum problems (Filip Hanzely, Dmitry Kovalev, Peter Richtárik). International Conference on Machine Learning, 2020.

2019

 RSN: randomized subspace Newton (Robert Gower, Dmitry Kovalev, Felix Lieder, Peter Richtárik). Advances in Neural Information Processing Systems, 2019.
 Stochastic proximal langevin algorithm: Potential splitting and nonasymptotic rates (Adil Salim, Dmitry Kovalev, Peter Richtárik). Advances in Neural Information Processing Systems, 2019.

2018

 Stochastic spectral and conjugate descent methods (Dmitry Kovalev, Peter Richtárik, Eduard Gorbunov, Elnur Gasanov). Advances in Neural Information Processing Systems, 2018.

Preprints

2024

 On Linear Convergence in Smooth Convex-Concave Bilinearly-Coupled Saddle-Point Optimization: Lower Bounds and Optimal Algorithms (Dmitry Kovalev, Ekaterina Borodich). arXiv preprint arXiv:2411.14601, 2024.
 Decentralized Optimization with Coupled Constraints (Demyan Yarmoshik, Alexander Rogozin, Nikita Kiselev, Daniil Dorin, Alexander Gasnikov, Dmitry Kovalev). arXiv preprint arXiv:2407.02020, 2024.
 Lower Bounds and Optimal Algorithms for Non-Smooth Convex Decentralized Optimization over Time-Varying Networks (Dmitry Kovalev, Ekaterina Borodich, Alexander Gasnikov, Dmitrii Feoktistov). arXiv preprint arXiv:2405.18031, 2024.
 Decentralized finite-sum optimization over time-varying networks (Dmitry Metelev, Savelii Chezhegov, Alexander Rogozin, Aleksandr Beznosikov, Alexander Sholokhov, Alexander Gasnikov, Dmitry Kovalev). arXiv preprint arXiv:2402.02490, 2024.

2023

 Optimal algorithm with complexity separation for strongly convex-strongly concave composite saddle point problems (Ekaterina Borodich, Georgiy Kormakov, Dmitry Kovalev, Aleksandr Beznosikov, Alexander Gasnikov). arXiv preprint arXiv:2307.12946, 2023.

2022

 An optimal algorithm for strongly convex min-min optimization (Alexander Gasnikov, Dmitry Kovalev, Grigory Malinovsky). arXiv preprint arXiv:2212.14439, 2022.
 On scaled methods for saddle point problems (Aleksandr Beznosikov, Aibek Alanov, Dmitry Kovalev, Martin Takáč, Alexander Gasnikov). arXiv preprint arXiv:2206.08303, 2022.

2021

 Decentralized distributed optimization for saddle point problems (Alexander Rogozin, Aleksandr Beznosikov, Darina Dvinskikh, Dmitry Kovalev, Pavel Dvurechensky, Alexander Gasnikov). arXiv preprint arXiv:2102.07758, 2021.

2020

 Fast linear convergence of randomized BFGS (Dmitry Kovalev, Robert Gower, Peter Richtárik, Alexander Rogozin). arXiv preprint arXiv:2002.11337, 2020.

2019

 Distributed fixed point methods with compressed iterates (Sélim Chraibi, Ahmed Khaled, Dmitry Kovalev, Peter Richtárik, Adil Salim, Martin Takáč). arXiv preprint arXiv:1912.09925, 2019.
 Stochastic Newton and cubic Newton methods with simple local linear-quadratic rates (Dmitry Kovalev, Konstantin Mishchenko, Peter Richtárik). arXiv preprint arXiv:1912.01597, 2019.

Contact

Morozov Business Center, Room 2474
Moscow, 119021, Russia