Accelerating Heterogeneous Federated Learning with Closed-form Classifiers

ICML 2024

Abstract
Federated Learning (FL) methods often struggle in highly statistically heterogeneous settings. Indeed, non-IID data distributions cause client drift and biased local solutions, particularly pronounced in the final classification layer, negatively impacting convergence speed and accuracy. To address this issue, we introduce Federated Recursive Ridge Regression (Fed3R). Our method fits a Ridge Regression classifier computed in closed form leveraging pre-trained features. Fed3R is immune to statistical heterogeneity and is invariant to the sampling order of the clients. Therefore, it proves particularly effective in cross-device scenarios. Furthermore, it is fast and efficient in terms of communication and computation costs, requiring up to two orders of magnitude fewer resources than the competitors. Finally, we propose to leverage the Fed3R parameters as an initialization for a softmax classifier and subsequently fine-tune the model using any FL algorithm (Fed3R with Fine-Tuning, Fed3R+FT). Our findings also indicate that maintaining a fixed classifier aids in stabilizing the training and learning more discriminative features in cross-device settings. Official website: https://fed-3r.github.io/.
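The closed-form construction described in the abstract can be sketched as follows. This is a minimal illustration, assuming each client summarizes its pre-trained features with two sufficient statistics (a feature Gram matrix and a feature-label cross term) and the server solves the regularized normal equations; the function names, regularization value, and toy data below are hypothetical and do not reproduce the authors' exact implementation.

```python
import numpy as np

def client_statistics(features, labels_onehot):
    """Each client summarizes its local data as two sufficient statistics:
    the Gram matrix G = X^T X and the cross term C = X^T Y.
    Because these are plain sums, aggregation is invariant to the order
    in which clients are sampled."""
    G = features.T @ features          # (d, d)
    C = features.T @ labels_onehot     # (d, num_classes)
    return G, C

def server_aggregate(stats, lam=1e-2):
    """The server sums the per-client statistics and solves ridge regression
    in closed form: W = (sum_k G_k + lam * I)^{-1} (sum_k C_k).
    The regularization strength lam is an assumed placeholder value."""
    d = stats[0][0].shape[0]
    G_total = sum(G for G, _ in stats)
    C_total = sum(C for _, C in stats)
    return np.linalg.solve(G_total + lam * np.eye(d), C_total)

# Toy usage: three clients with 64-dimensional pre-trained features, 10 classes.
rng = np.random.default_rng(0)
stats = []
for _ in range(3):
    X = rng.normal(size=(100, 64))                  # pre-trained features
    Y = np.eye(10)[rng.integers(0, 10, size=100)]   # one-hot labels
    stats.append(client_statistics(X, Y))
W = server_aggregate(stats)                         # (64, 10) classifier weights
```

Under these assumptions, the resulting weight matrix could then serve as the initialization of a softmax classifier that is further fine-tuned with any standard FL algorithm, mirroring the Fed3R+FT variant described above.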