BiCoLoR: Communication-Efficient Optimization with Bidirectional Compression and Local Training
We introduce BiCoLoR, the first algorithm to combine local training with bidirectional compression using arbitrary unbiased compressors, achieving an accelerated communication complexity and demonstrating superior empirical performance.
Overview
Slow and costly communication is often the main bottleneck in distributed optimization, especially in federated learning, where communication takes place over the air. We introduce BiCoLoR, a communication-efficient optimization algorithm. It harnesses two widely used and efficient techniques: local training, which consists of performing more computation between communication rounds, and compression, by which high-dimensional vectors are encoded into short bitstreams. These two mechanisms have been combined before, but with compression applied only to uplink (client-to-server) communication, while downlink (server-to-client) communication was left uncompressed. In practice, both directions are expensive.
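To make the two building blocks mentioned above concrete, here is a minimal sketch (not the BiCoLoR algorithm itself) of a generic federated round in which clients run a few local gradient steps and an unbiased rand-k sparsifier compresses both the uplink updates and the downlink broadcast. The names (rand_k, one_round), the toy quadratic objectives, and the plain averaging of updates are illustrative assumptions; a practical method would add further machinery to control the compression variance, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_k(x, k):
    """Unbiased rand-k sparsifier: keep k random coordinates and
    rescale by d/k so that E[rand_k(x)] = x (an unbiased compressor)."""
    d = x.size
    mask = np.zeros(d)
    idx = rng.choice(d, size=k, replace=False)
    mask[idx] = d / k
    return mask * x

def local_training(x, grad, num_local_steps, lr):
    """Local training: take several gradient steps before communicating."""
    for _ in range(num_local_steps):
        x = x - lr * grad(x)
    return x

def one_round(x_server, client_grads, k, lr, num_local_steps):
    """One communication round with compression on both links."""
    updates = []
    for grad in client_grads:
        x_local = local_training(x_server.copy(), grad, num_local_steps, lr)
        # Uplink: each client sends a compressed model update to the server.
        updates.append(rand_k(x_local - x_server, k))
    avg_update = np.mean(updates, axis=0)
    # Downlink: the server broadcasts a compressed aggregate back to clients.
    return x_server + rand_k(avg_update, k)

# Toy example: client i minimizes ||x - b_i||^2 / 2 for its own target b_i.
d, n_clients = 20, 5
targets = [rng.normal(size=d) for _ in range(n_clients)]
grads = [lambda x, b=b: x - b for b in targets]
x = np.zeros(d)
for _ in range(200):
    x = one_round(x, grads, k=4, lr=0.1, num_local_steps=5)
print("distance to average target:", np.linalg.norm(x - np.mean(targets, axis=0)))
```

In this sketch, the rand-k operator preserves the expectation of what it compresses, so the averaged compressed updates still point, on average, toward the clients' common solution; the same compressor applied on the downlink keeps the broadcast cheap in both directions.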
Presenter
Brief Biography
Dr. Laurent Condat received his Ph.D. in Applied Mathematics in 2006 from Grenoble Institute of Technology, France. Following a postdoc at Helmholtz Zentrum München, Germany, he joined the French National Center for Scientific Research (CNRS) in 2008 as a permanent researcher. He worked at GREYC, Caen, before moving to GIPSA-Lab, Grenoble, in 2012. From 2016 to 2019, he served as a member of the French National Committee for Scientific Research. Since 2019, he has been on leave from the CNRS and is currently a Senior Research Scientist at KAUST.
He is a Senior Member of the IEEE and a Senior Area Editor of IEEE Transactions on Signal Processing. His recognitions include a Best Student Paper Award at IEEE ICIP 2005, the Best Ph.D. Award from Grenoble Institute of Technology, and several Meritorious Reviewer Awards. Since 2020, he has been listed among the world’s top 2% most influential scientists by Stanford University.