Communication-Efficient Heterogeneous Federated Learning with Generalized Heavy-Ball Momentum
CoRR (2023)
Abstract
Federated Learning (FL) has emerged as the state-of-the-art approach for learning from decentralized data in privacy-constrained scenarios. However, system and statistical challenges hinder real-world applications, which demand efficient learning from edge devices and robustness to heterogeneity. Despite significant research efforts, existing approaches (i) are not sufficiently robust, (ii) do not perform well in large-scale scenarios, and (iii) are not communication efficient. In this work, we propose a novel Generalized Heavy-Ball Momentum (GHBM), motivating its principled application to counteract the effects of statistical heterogeneity in FL. Then, we present FedHBM as an adaptive, communication-efficient by-design instance of GHBM. Extensive experimentation on vision and language tasks, in both controlled and realistic large-scale scenarios, provides compelling evidence of substantial and consistent performance gains over the state of the art.
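The abstract does not spell out the GHBM update rule, but it builds on classical heavy-ball momentum. As a minimal recap of the standard (non-generalized) heavy-ball iteration that GHBM extends, with standard notation not taken from the source ($\eta$ a step size, $\beta$ a momentum coefficient, $f$ the objective):

$$x^{t+1} = x^{t} - \eta \, \nabla f(x^{t}) + \beta \left( x^{t} - x^{t-1} \right)$$

The momentum term $\beta (x^{t} - x^{t-1})$ reuses the previous displacement to smooth the update direction; the paper's contribution lies in how this idea is generalized and instantiated (FedHBM) for heterogeneous, communication-constrained FL, details of which are not given in the abstract.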
Keywords
Federated Learning