Location
Building 9, Level 3, Room 3125
Senior Research Scientist, Visual Computing Center
Abstract
In distributed optimization, and even more so in federated learning, communication is the main bottleneck. We introduce LoCoDL, a communication-efficient algorithm that leverages the two techniques of Local training, which reduces the communication frequency, and Compression with a large class of unbiased compressors that includes sparsification and quantization strategies. LoCoDL provably benefits from both mechanisms and enjoys a doubly-accelerated communication complexity, with respect to the condition number of the functions and the model dimension, in the general heterogeneous regime with strongly convex functions.
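As a concrete illustration of the kind of unbiased compressor the abstract refers to, below is a minimal Python sketch (not the authors' implementation) of rand-k sparsification: it keeps k coordinates chosen uniformly at random and rescales them by d/k, so the compressed vector equals the original in expectation. The function name rand_k and its parameters are illustrative assumptions, not part of LoCoDL itself.

```python
# Minimal sketch of an unbiased rand-k sparsifier (illustrative only;
# not the LoCoDL reference implementation).
import numpy as np

def rand_k(x: np.ndarray, k: int, rng: np.random.Generator) -> np.ndarray:
    """Keep k coordinates of x chosen uniformly at random, zero the rest,
    and rescale by d/k so that E[rand_k(x)] = x (unbiasedness)."""
    d = x.size
    out = np.zeros_like(x)
    idx = rng.choice(d, size=k, replace=False)  # random support of size k
    out[idx] = x[idx] * (d / k)                 # scaling makes it unbiased
    return out

# Empirical check: averaging many compressed copies recovers x.
rng = np.random.default_rng(0)
x = rng.standard_normal(10)
mean_est = np.mean([rand_k(x, k=3, rng=rng) for _ in range(100_000)], axis=0)
print(np.allclose(mean_est, x, atol=0.05))  # True, up to Monte Carlo noise
```

Unbiasedness, E[C(x)] = x, is the defining property of the compressor class named in the abstract; sparsifiers like the one above and stochastic quantizers both satisfy it.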