Paper Title

Combating Client Dropout in Federated Learning via Friend Model Substitution

Paper Authors

Heqiang Wang, Jie Xu

Paper Abstract

Federated learning (FL) is a new distributed machine learning framework known for its benefits in data privacy and communication efficiency. Since full client participation in many cases is infeasible due to constrained resources, partial participation FL algorithms have been investigated that proactively select/sample a subset of clients, aiming to achieve learning performance close to the full participation case. This paper studies a passive partial client participation scenario that is much less well understood, where partial participation is a result of external events, namely client dropout, rather than a decision of the FL algorithm. We cast FL with client dropout as a special case of a larger class of FL problems where clients can submit substitute (possibly inaccurate) local model updates. Based on our convergence analysis, we develop a new algorithm FL-FDMS that discovers friends of clients (i.e., clients whose data distributions are similar) on-the-fly and uses friends' local updates as substitutes for the dropout clients, thereby reducing the substitution error and improving the convergence performance. A complexity reduction mechanism is also incorporated into FL-FDMS, making it both theoretically sound and practically useful. Experiments on MNIST and CIFAR-10 confirmed the superior performance of FL-FDMS in handling client dropout in FL.
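To make the substitution idea concrete, below is a minimal Python sketch of one server-side aggregation round with friend model substitution. The abstract does not specify how FL-FDMS measures client similarity, discovers friends, or weights the aggregation, so the cosine similarity over each client's most recent update, the uniform averaging, and all names (`aggregate_with_friend_substitution`, `last_updates`, `current_updates`) are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine similarity between two flattened update vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def aggregate_with_friend_substitution(last_updates, current_updates, dropped):
    """One aggregation round with friend substitution (illustrative sketch).

    last_updates:    dict client_id -> most recent update vector, kept by the
                     server as a rough proxy for each client's data distribution
    current_updates: dict client_id -> update received this round (active clients)
    dropped:         set of client ids that dropped out this round
    """
    active = list(current_updates.keys())
    filled = dict(current_updates)
    for c in dropped:
        # "Friend" = active client whose last update is most similar (assumed metric).
        friend = max(active, key=lambda a: cosine_similarity(last_updates[c],
                                                             last_updates[a]))
        filled[c] = current_updates[friend]
    # Uniform averaging over all clients, active and substituted (assumed weights).
    return np.mean(list(filled.values()), axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    last_updates = {i: rng.normal(size=10) for i in range(5)}
    # Clients 3 and 4 drop out this round; only clients 0-2 report updates.
    current_updates = {i: rng.normal(size=10) for i in range(3)}
    g = aggregate_with_friend_substitution(last_updates, current_updates, {3, 4})
    print(g.shape)  # (10,)
```

In this sketch a dropped client's slot is filled by the most similar active client rather than being dropped from the average, which captures the paper's stated intuition: substituting a friend with a similar data distribution reduces the substitution error relative to simply ignoring the dropout.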
