Paper Title

Federated Learning via Synthetic Data

Authors

Jack Goetz, Ambuj Tewari

Abstract

Federated learning allows for the training of a model using data on multiple clients without the clients transmitting that raw data. However, the standard method is to transmit model parameters (or updates), which for modern neural networks can be on the scale of millions of parameters, inflicting significant computational costs on the clients. We propose a method for federated learning where, instead of transmitting a gradient update back to the server, we transmit a small amount of synthetic 'data'. We describe the procedure and show some experimental results suggesting this procedure has potential, providing more than an order-of-magnitude reduction in communication costs with minimal model degradation.
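The paper's actual method distills synthetic data for neural networks by optimization, but the core idea — a client uploads a few synthetic points whose induced gradient stands in for its true local gradient — can be illustrated in closed form for a toy linear model with squared loss. The sketch below is not the authors' algorithm; all function names (`local_grad`, `synthesize`) and the single-point construction are assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5  # model dimension

def local_grad(w, X, y):
    # Gradient of the mean-squared-error loss 0.5 * mean((X @ w - y)**2).
    return X.T @ (X @ w - y) / len(y)

def synthesize(w, X, y):
    """Hypothetical closed-form distillation for a linear model: build one
    synthetic point (xs, ys) whose single-example gradient at w equals the
    client's true local gradient on its private data."""
    g = local_grad(w, X, y)
    xs = g                 # point in the direction of the local gradient
    ys = xs @ w - 1.0      # chosen so that xs * (xs @ w - ys) == g
    return xs, ys

# Two clients hold private data that the server never sees.
w0 = rng.normal(size=d)
clients = [(rng.normal(size=(20, d)), rng.normal(size=20)) for _ in range(2)]

# Each client uploads only its synthetic point (d + 1 numbers).
synthetic = [synthesize(w0, X, y) for X, y in clients]

# The server recovers each client's gradient by differentiating its own loss
# on the synthetic point, then takes an averaged step (FedSGD-style).
recovered = [xs * (xs @ w0 - ys) for xs, ys in synthetic]
w1 = w0 - 0.1 * np.mean(recovered, axis=0)

true_grads = [local_grad(w0, X, y) for X, y in clients]
```

For a linear model the synthetic point is the same size as the gradient, so there is no saving here; the communication advantage in the paper arises when the model has millions of parameters while the transmitted synthetic set stays small.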
