Title
Timely Communication in Federated Learning
Authors
Abstract
We consider a federated learning framework in which a parameter server (PS) trains a global model by using $n$ clients without actually storing the client data centrally at a cloud server. Focusing on a setting where the client datasets are fast changing and highly temporal in nature, we investigate the timeliness of model updates and propose a novel timely communication scheme. Under the proposed scheme, at each iteration, the PS waits for $m$ available clients and sends them the current model. Then, the PS uses the local updates of the earliest $k$ out of $m$ clients to update the global model at each iteration. We find the average age of information experienced by each client and numerically characterize the age-optimal $m$ and $k$ values for a given $n$. Our results indicate that, in addition to ensuring timeliness, the proposed communication scheme results in significantly smaller average iteration times compared to random client selection without hurting the convergence of the global learning task.
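To make the scheme concrete, here is a minimal Monte Carlo sketch of the average iteration time under the proposed earliest-$k$-of-$m$ protocol versus a baseline that waits on $k$ randomly selected clients. This is not the authors' implementation: the i.i.d. unit-rate exponential models for client availability and local update delays, and the parameter values $n=20$, $m=8$, $k=4$, are illustrative assumptions for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def iter_time_earliest_k(n, m, k, iters=100_000):
    """Proposed scheme: the PS waits until m of the n clients are available,
    sends them the current model, then uses the earliest k of the m local
    updates. Availability and update delays are i.i.d. unit-rate
    exponentials (a modeling assumption, not from the paper)."""
    # time until the m-th client becomes available: m-th order statistic of n
    avail = np.sort(rng.exponential(1.0, (iters, n)), axis=1)[:, m - 1]
    # time until the k-th fastest of the m contacted clients returns an update
    update = np.sort(rng.exponential(1.0, (iters, m)), axis=1)[:, k - 1]
    return (avail + update).mean()

def iter_time_random_k(k, iters=100_000):
    """Baseline: the PS sends the model to k uniformly random clients and
    waits for all of them (each client's availability plus update delay)."""
    totals = rng.exponential(1.0, (iters, k)) + rng.exponential(1.0, (iters, k))
    return totals.max(axis=1).mean()

n, m, k = 20, 8, 4
print(f"earliest {k} of {m} (n={n}): {iter_time_earliest_k(n, m, k):.3f}")
print(f"random {k}, wait for all : {iter_time_random_k(k):.3f}")
```

Under this exponential model both waiting terms are order statistics of exponentials, so the simulated means can be sanity-checked against the closed form $\mathbb{E}[X_{(m:n)}] = \sum_{i=n-m+1}^{n} 1/i$.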