Title

FedBR: Improving Federated Learning on Heterogeneous Data via Local Learning Bias Reduction

Authors

Yongxin Guo, Xiaoying Tang, Tao Lin

Abstract

Federated Learning (FL) is a way for machines to learn from data that is kept locally, in order to protect the privacy of clients. This is typically done using local SGD, which helps to improve communication efficiency. However, such a scheme is currently constrained by slow and unstable convergence due to the variety of data on different clients' devices. In this work, we identify three under-explored phenomena of biased local learning that may explain these challenges caused by local updates in supervised FL. As a remedy, we propose FedBR, a novel unified algorithm that reduces the local learning bias on features and classifiers to tackle these challenges. FedBR has two components. The first component helps to reduce bias in local classifiers by balancing the output of the models. The second component helps to learn local features that are similar to global features, but different from those learned from other data sources. We conducted several experiments to test FedBR and found that it consistently outperforms other SOTA FL methods. Both of its components also individually show performance gains. Our code is available at https://github.com/lins-lab/fedbr.
