Paper Title


The Fundamental Price of Secure Aggregation in Differentially Private Federated Learning

Paper Authors

Wei-Ning Chen, Christopher A. Choquette-Choo, Peter Kairouz, Ananda Theertha Suresh

Paper Abstract


We consider the problem of training a $d$-dimensional model with distributed differential privacy (DP), where secure aggregation (SecAgg) is used to ensure that the server only sees the noisy sum of the $n$ model updates in every training round. Taking into account the constraints imposed by SecAgg, we characterize the fundamental communication cost required to obtain the best accuracy achievable under $\varepsilon$ central DP (i.e., under a fully trusted server and no communication constraints). Our results show that $\tilde{O}\left( \min(n^2\varepsilon^2, d) \right)$ bits per client are both sufficient and necessary, and that this fundamental limit can be achieved by a linear scheme based on sparse random projections. This provides a significant improvement over state-of-the-art SecAgg distributed DP schemes, which use $\tilde{O}(d\log(d/\varepsilon^2))$ bits per client. Empirically, we evaluate our proposed scheme on real-world federated learning tasks. We find that our theoretical analysis is well matched in practice. In particular, we show that, in realistic privacy settings, we can reduce the communication cost significantly, to under $1.2$ bits per parameter, without degrading test-time performance. Our work hence theoretically and empirically specifies the fundamental price of using SecAgg.
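To make the abstract's compression idea concrete, the sketch below compresses a $d$-dimensional client update down to roughly $\min(n^2\varepsilon^2, d)$ coordinates via a shared random projection, so that the server can aggregate compressed updates linearly and recover an unbiased estimate of their sum. This is a simplified coordinate-subsampling stand-in, not the paper's exact construction; the function names, the shared-seed convention, and the toy parameter values are assumptions for illustration, and both the DP noise addition and the SecAgg protocol itself are omitted.

```python
# Minimal illustrative sketch (NOT the paper's exact scheme): compress a client's
# d-dimensional model update with a shared sparse random projection so that the
# server can sum compressed updates linearly. DP noise and SecAgg are omitted.
import numpy as np

def sample_projection(d, n_clients, epsilon, seed=0):
    """Sample the shared projection: k ~ min(n^2 * eps^2, d) coordinates plus random signs."""
    k = min(int(np.ceil((n_clients * epsilon) ** 2)), d)
    rng = np.random.default_rng(seed)            # shared seed -> identical projection on all clients
    idx = rng.choice(d, size=k, replace=False)   # sparse projection: keep k of the d coordinates
    signs = rng.choice([-1.0, 1.0], size=k)      # random signs
    return idx, signs

def compress_update(update, idx, signs):
    """Client side: project the update down to k dimensions (a linear map)."""
    d, k = update.shape[0], idx.shape[0]
    return np.sqrt(d / k) * signs * update[idx]

def decompress_sum(compressed_sum, d, idx, signs):
    """Server side: unbiased estimate of the sum of updates from the aggregated projections."""
    k = idx.shape[0]
    estimate = np.zeros(d)
    estimate[idx] = np.sqrt(d / k) * signs * compressed_sum
    return estimate                              # unbiased over the randomness of idx

# Toy usage (illustrative numbers only).
d, n, eps = 10_000, 50, 1.0
idx, signs = sample_projection(d, n, eps)        # k = min(2500, 10000) = 2500
updates = [np.random.randn(d) for _ in range(n)]
compressed_sum = sum(compress_update(u, idx, signs) for u in updates)
estimated_sum = decompress_sum(compressed_sum, d, idx, signs)
print(idx.shape[0], estimated_sum.shape)         # 2500 (10000,)
```

In an actual distributed DP pipeline each client would additionally discretize and add noise to its compressed update before SecAgg; the point of the sketch is only that the compression is linear and seed-shared, so summing compressed vectors commutes with decompression on the server.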
