EF-BV: distributed optimization with compressed communication
Abstract
In the big data era, it is necessary to rely on distributed computing. Distributed optimization and learning tasks, in particular in the modern paradigm of federated learning, raise specific challenges, such as decentralized data storage. Communication between the parallel machines and the distant orchestrating server is necessary but slow. To address this main bottleneck, a natural strategy is to compress the communicated vectors. I will present EF-BV, a new algorithm that converges linearly to an exact solution, with a large class of compressors, which can be deterministic or random, biased or unbiased. The corresponding paper, "EF-BV: A unified theory of error feedback and variance reduction for biased and unbiased compression in distributed optimization", has just been accepted at NeurIPS 2022.
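For concreteness, here is a minimal, self-contained sketch in Python/NumPy of two standard compressors from the class the abstract refers to: top-k (deterministic, biased) and rand-k with rescaling (random, unbiased). This illustration, including the names top_k and rand_k, is mine and not code from the paper.

    import numpy as np

    def top_k(x, k):
        # Deterministic, biased compressor: keep the k largest-magnitude
        # entries of x and zero out the rest.
        out = np.zeros_like(x)
        idx = np.argpartition(np.abs(x), -k)[-k:]
        out[idx] = x[idx]
        return out

    def rand_k(x, k, rng=None):
        # Random, unbiased compressor: keep k uniformly chosen entries,
        # rescaled by d/k so that E[rand_k(x)] = x.
        if rng is None:
            rng = np.random.default_rng()
        d = x.size
        idx = rng.choice(d, size=k, replace=False)
        out = np.zeros_like(x)
        out[idx] = x[idx] * (d / k)
        return out

    x = np.array([3.0, -0.5, 2.0, 0.1, -4.0])
    print(top_k(x, 2))   # keeps -4.0 and 3.0, deterministically
    print(rand_k(x, 2))  # random support, equal to x in expectation

Both compressors transmit only k of the d coordinates per communication round; the point of EF-BV is to provide linear convergence to an exact solution for both regimes within a single framework.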
Brief Biography
Laurent Condat has been a Research Scientist working with Professor Peter Richtárik at the Visual Computing Center since November 2019. Dr. Condat's areas of interest include convex optimization models and algorithms, as well as signal and image processing. He has co-authored more than 100 articles on these topics and is a Senior Member of the IEEE.