Abstract
In this work we focus our attention on distributed optimization problems in the context where the communication time between the server and the workers is non-negligible. We obtain novel methods supporting bidirectional compression (both from the server to the workers and vice versa) that enjoy new state-of-the-art theoretical communication complexity for convex and nonconvex problems. Our bounds are the first that manage to decouple the variance/error coming from the worker-to-server and server-to-worker compression, transforming a multiplicative dependence into an additive one. Moreover, in the convex regime, we obtain the first bounds that match the theoretical communication complexity of gradient descent. Even in this convex regime, our algorithms work with biased gradient estimators, which is non-standard and requires new proof techniques that may be of independent interest. Finally, our theoretical results are corroborated through suitable experiments.
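To make the setting concrete, below is a minimal Python sketch of bidirectional compression in a toy distributed setup: the server broadcasts a compressed model to the workers, and each worker returns a compressed local gradient. The Rand-K compressor, the quadratic objective, and all names here are assumptions chosen for illustration; this naive scheme is not the accelerated method from the talk and, with a constant stepsize, only converges to a noise floor governed by the compression variance.

```python
import numpy as np

def rand_k(x, k, rng):
    """Unbiased Rand-K sparsifier: keep k random coordinates, scale by d/k."""
    d = x.size
    out = np.zeros_like(x)
    idx = rng.choice(d, size=k, replace=False)
    out[idx] = x[idx] * (d / k)
    return out

# Toy problem (an assumption for illustration): each worker i holds a
# quadratic f_i(w) = 0.5 * ||A_i w - b_i||^2; the goal is to minimize
# the average of the f_i.
rng = np.random.default_rng(0)
n_workers, d, k = 4, 20, 5
A = [rng.normal(size=(30, d)) for _ in range(n_workers)]
b = [rng.normal(size=30) for _ in range(n_workers)]

w = np.zeros(d)  # model state kept by the server
lr = 1e-3
for step in range(500):
    # Server -> workers: broadcast a compressed copy of the model.
    w_hat = rand_k(w, k, rng)
    # Workers -> server: each worker sends a compressed gradient
    # evaluated at the compressed model it received.
    grads = [rand_k(A_i.T @ (A_i @ w_hat - b_i), k, rng)
             for A_i, b_i in zip(A, b)]
    # Server aggregates the compressed gradients and takes a step.
    w -= lr * np.mean(grads, axis=0)

full_grad = np.mean([A_i.T @ (A_i @ w - b_i) for A_i, b_i in zip(A, b)],
                    axis=0)
print("full gradient norm after training:", np.linalg.norm(full_grad))
```

In this naive scheme the errors of the two compressors compound multiplicatively; the methods discussed in the talk add error-control mechanisms so that the two variance contributions enter only additively.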
Brief Biography
Alexander Tyurin is a Postdoctoral Research Fellow at the Visual Computing Center (VCC) at King Abdullah University of Science and Technology (KAUST), where he works on modern optimization problems with Professor Peter Richtárik. He defended his Ph.D. thesis at the Higher School of Economics under the supervision of Professor Alexander Gasnikov. Before that, he earned a Bachelor's degree in Computer Science from Lomonosov Moscow State University and a Master's degree in Optimization and Statistics from the Higher School of Economics. His research interests include optimization methods, in particular stochastic and distributed optimization, in the context of machine learning tasks.