Paper Title

Communication-Efficient Device Scheduling for Federated Learning Using Stochastic Optimization

Paper Authors

Jake Perazzone, Shiqiang Wang, Mingyue Ji, Kevin Chan

Paper Abstract

Federated learning (FL) is a useful tool in distributed machine learning that utilizes users' local datasets in a privacy-preserving manner. When deploying FL in a constrained wireless environment, however, training models in a time-efficient manner can be a challenging task due to intermittent connectivity of devices, heterogeneous connection quality, and non-i.i.d. data. In this paper, we provide a novel convergence analysis of non-convex loss functions using FL on both i.i.d. and non-i.i.d. datasets with arbitrary device selection probabilities for each round. Then, using the derived convergence bound, we use stochastic optimization to develop a new client selection and power allocation algorithm that minimizes a function of the convergence bound and the average communication time under a transmit power constraint. We find an analytical solution to the minimization problem. One key feature of the algorithm is that knowledge of the channel statistics is not required and only the instantaneous channel state information needs to be known. Using the FEMNIST and CIFAR-10 datasets, we show through simulations that the communication time can be significantly decreased using our algorithm, compared to uniformly random participation.
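To illustrate the aggregation setting the abstract describes, the following is a minimal sketch of one FL round with arbitrary per-device selection probabilities, where each participating device's update is scaled by the inverse of its selection probability so the aggregate remains unbiased. This is a generic illustration of probability-weighted participation, not the paper's actual scheduling or power allocation algorithm; the function `fl_round` and the toy model here are assumptions for demonstration only.

```python
import numpy as np

def fl_round(global_model, local_updates, probs, rng):
    """One aggregation round with arbitrary per-device selection
    probabilities (illustrative sketch, not the paper's algorithm).

    Device i participates independently with probability probs[i];
    its update is scaled by 1 / (probs[i] * n) so the expected
    aggregate equals the full-participation average update.
    """
    n = len(local_updates)
    participating = rng.random(n) < probs  # Bernoulli selection
    agg = np.zeros_like(global_model)
    for i in range(n):
        if participating[i]:
            # Inverse-probability weighting keeps the estimate unbiased.
            agg += local_updates[i] / (probs[i] * n)
    return global_model + agg

# Toy usage: 4 devices, a 3-dimensional model, heterogeneous probabilities.
rng = np.random.default_rng(0)
model = np.zeros(3)
updates = [np.ones(3) * (i + 1) for i in range(4)]
probs = np.array([0.9, 0.5, 0.5, 0.2])
new_model = fl_round(model, updates, probs, rng)
```

With all probabilities set to 1, every device participates and the round reduces to plain averaging of the local updates; the paper's contribution is choosing these probabilities (and transmit powers) to jointly minimize a convergence-bound term and the average communication time.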
