Abstract
Non-convex machine learning problems typically do not satisfy the standard smoothness assumption. Based on empirical findings, a more realistic generalized smoothness assumption has been proposed, though it remains largely unexplored, and many existing algorithms designed for standard smooth problems need to be revised to handle it. In this paper, we propose and analyze new Federated Learning methods with local steps, partial participation of clients, and Random Reshuffling, without extra restrictive assumptions beyond generalized smoothness. Our theory is consistent with the known results for standard smooth problems, and our experimental results support the theoretical insights.
Brief Biography
Yury Demidovich earned his PhD from the Moscow Institute of Physics and Technology, Department of Innovation and High Technology, where he studied from 2017 to 2021. Prior to that, he obtained a Master’s degree in Fundamental Mathematics and Mechanics from Lomonosov Moscow State University, where he studied from 2011 to 2017.