About

I’m a student at King Abdullah University of Science and Technology, where I’m currently studying Computer Science and doing research in the Visual Computing Center under the supervision of Peter Richtárik. Prior to this, I received my BS degree in Applied Mathematics and Physics from the Moscow Institute of Physics and Technology.

Interests

  • Optimization
  • Machine Learning

Education

  • PhD in Computer Science, in progress
    King Abdullah University of Science and Technology
  • MS in Applied Mathematics and Physics, 2021
    Moscow Institute of Physics and Technology
  • MS in Computer Science, 2019
    King Abdullah University of Science and Technology
  • BS in Applied Mathematics and Physics, 2018
    Moscow Institute of Physics and Technology

Recent Papers

Optimal Decentralized Algorithms for Saddle Point Problems over Time-Varying Networks

13 Jul 2021

Lower Bounds and Optimal Algorithms for Smooth and Strongly Convex Decentralized Optimization Over Time-Varying Networks

08 Jun 2021

An Optimal Algorithm for Strongly Convex Minimization under Affine Constraints

22 Feb 2021

ADOM: Accelerated Decentralized Optimization Method for Time-Varying Networks

18 Feb 2021

IntSGD: Floatless Compression of Stochastic Gradients

16 Feb 2021

Decentralized Distributed Optimization for Saddle Point Problems

15 Feb 2021

A Linearly Convergent Algorithm for Decentralized Optimization: Sending Less Bits for Free!

03 Nov 2020

Linearly Converging Error Compensated SGD

23 Oct 2020

Towards Accelerated Rates for Distributed Optimization over Time-varying Networks

23 Sep 2020

Optimal and Practical Algorithms for Smooth and Strongly Convex Decentralized Optimization

21 Jun 2020

From Local SGD to Local Fixed Point Methods for Federated Learning

03 Apr 2020

Acceleration for Compressed Gradient Descent in Distributed and Federated Optimization

26 Feb 2020

Fast Linear Convergence of Randomized BFGS

26 Feb 2020

Variance Reduced Coordinate Descent with Acceleration: New Method With a Surprising Application to Finite-Sum Problems

11 Feb 2020

Distributed Fixed Point Methods with Compressed Iterates

20 Dec 2019

Accelerated methods for composite non-bilinear saddle point problem

09 Dec 2019

Stochastic Newton and Cubic Newton Methods with Simple Local Linear-Quadratic Rates

03 Dec 2019

Stochastic Proximal Langevin Algorithm: Potential Splitting and Nonasymptotic Rates

28 May 2019

RSN: Randomized Subspace Newton

26 May 2019

Stochastic Distributed Learning with Gradient Quantization and Variance Reduction

04 Apr 2019

Don't Jump Through Hoops and Remove Those Loops: SVRG and Katyusha are Better Without the Outer Loop

24 Jan 2019

A hypothesis about the rate of global convergence for optimal methods (Newton's type) in smooth convex optimization

28 Feb 2018

Stochastic Spectral and Conjugate Descent Methods

11 Feb 2018

Recent Posts

ICCOPT 2019

I gave a talk, "Revisiting Stochastic Extragradient Method", as part of the "Variational Inequalities, Minimax Problems and GANs" session organised by Konstantin Mishchenko.

05 Aug 2019

Data Science Summer School 2019

I'm presenting a poster at the Data Science Summer School at École polytechnique, Paris, France.

26 Jun 2019

June 2019 plans

I'm staying in Moscow until June 16. I will attend the "Control, Information, Optimization" summer school in Voronovo, Moscow region, from June 17 to June 22. Then I will attend the Data Science Summer School in Paris, France, from June 24 to June 28.

11 Jun 2019

Talk: Stochastic Spectral and Conjugate Descent Methods

Today I gave a talk at the "Automatic Control and Optimization Theory" seminar at the Institute for Control Problems, Moscow. The talk is based on my NeurIPS 2018 paper.

26 Mar 2019

Moscow visit

I’m staying in Moscow from 18 March to 2 April. I'm going to give two talks and take a break from work.

18 Mar 2019

Contact