Paper Title

FedADMM: A Federated Primal-Dual Algorithm Allowing Partial Participation

Paper Authors

Han Wang, Siddartha Marella, James Anderson

Paper Abstract

Federated learning is a framework for distributed optimization that places emphasis on communication efficiency. In particular, it follows a client-server broadcast model and is particularly appealing because of its ability to accommodate heterogeneity in client compute and storage resources, non-i.i.d. data assumptions, and data privacy. Our contribution is to offer a new federated learning algorithm, FedADMM, for solving non-convex composite optimization problems with non-smooth regularizers. We prove convergence of FedADMM for the case when not all clients are able to participate in a given communication round, under a very general sampling model.
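To make the setting concrete, the following is a minimal illustrative sketch of a consensus-ADMM loop with partial client participation. It is not the paper's FedADMM (which covers non-convex losses); all names, step sizes, and the choice of quadratic local losses with an l1 regularizer are assumptions made purely for illustration of the client-server, partial-participation pattern the abstract describes.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t * ||.||_1, handling the non-smooth regularizer at the server."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fedadmm_sketch(clients, n, rounds=200, sample_frac=0.5, rho=1.0, lam=0.1, seed=0):
    """Toy consensus-ADMM loop with partial participation (illustrative only).

    Each client i holds data (A_i, b_i) for a quadratic loss
    0.5 * ||A_i x - b_i||^2; the server applies the prox of lam * ||x||_1.
    Only a random subset of clients updates in each communication round.
    """
    rng = np.random.default_rng(seed)
    m = len(clients)
    x = np.zeros(n)                       # global (server) variable
    z = [np.zeros(n) for _ in range(m)]   # local primal variables
    u = [np.zeros(n) for _ in range(m)]   # local (scaled) dual variables
    for _ in range(rounds):
        # Partial participation: only a sampled subset of clients works this round.
        k = max(1, int(sample_frac * m))
        for i in rng.choice(m, size=k, replace=False):
            A, b = clients[i]
            # Local primal step: argmin_z 0.5||Az - b||^2 + (rho/2)||z - x + u_i||^2
            z[i] = np.linalg.solve(A.T @ A + rho * np.eye(n),
                                   A.T @ b + rho * (x - u[i]))
            u[i] = u[i] + z[i] - x        # local dual step
        # Server step: average the latest client states, then apply the
        # regularizer's prox (non-participating clients contribute stale states).
        avg = sum(z[i] + u[i] for i in range(m)) / m
        x = soft_threshold(avg, lam / (rho * m))
    return x
```

The server-side soft-thresholding is what accommodates the non-smooth regularizer: the smooth local losses are handled on the clients, while the prox of the non-smooth term is applied once per round on the aggregate.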
