Paper Title

Differentially Private Federated Learning with Local Regularization and Sparsification

Paper Authors

Anda Cheng, Peisong Wang, Xi Sheryl Zhang, Jian Cheng

Abstract

User-level differential privacy (DP) provides certifiable privacy guarantees for the information that is specific to any user's data in federated learning. Existing methods that ensure user-level DP come at the cost of a severe decrease in accuracy. In this paper, we study the cause of model performance degradation in federated learning under a user-level DP guarantee. We find that the key to solving this issue is to naturally restrict the norm of local updates before executing the operations that guarantee DP. To this end, we propose two techniques, Bounded Local Update Regularization and Local Update Sparsification, to increase model quality without sacrificing privacy. We provide theoretical analysis of the convergence of our framework and give rigorous privacy guarantees. Extensive experiments show that our framework significantly improves the privacy-utility trade-off over the state of the art for federated learning with user-level DP guarantees.
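The abstract describes a pipeline in which local updates are kept small (via regularization) and made sparse before the usual user-level DP steps of per-user norm clipping and Gaussian noising. Below is a minimal NumPy sketch of one such round. It is an illustrative reading, not the authors' implementation: the top-k sparsification rule, the penalty form in `bounded_update_penalty`, and all names and parameters (`sparsify_top_k`, `clip_norm`, `noise_multiplier`, `gamma`) are assumptions made for exposition.

```python
import numpy as np

def bounded_update_penalty(local_weights, global_weights, bound, gamma):
    """Illustrative regularizer added to the local training loss: penalize the
    local-update norm only beyond `bound`, so updates stay naturally small
    before clipping (a hedged reading of Bounded Local Update Regularization)."""
    norm = np.linalg.norm(local_weights - global_weights)
    return gamma * max(0.0, norm - bound) ** 2

def sparsify_top_k(update, k):
    """Keep only the k largest-magnitude entries of a local update
    (one possible form of Local Update Sparsification)."""
    flat = update.ravel().copy()
    if k < flat.size:
        # Zero out every entry below the k-th largest magnitude.
        smaller_idx = np.argpartition(np.abs(flat), -k)[:-k]
        flat[smaller_idx] = 0.0
    return flat.reshape(update.shape)

def clip_update(update, clip_norm):
    """Clip the L2 norm of a user's update to clip_norm
    (the standard user-level DP sensitivity-bounding step)."""
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / (norm + 1e-12))

def dp_fedavg_round(local_updates, clip_norm, noise_multiplier, k, rng):
    """One server round: sparsify and clip each user's update, average,
    then add Gaussian noise calibrated to the clipping bound."""
    processed = [clip_update(sparsify_top_k(u, k), clip_norm)
                 for u in local_updates]
    mean_update = np.mean(processed, axis=0)
    # After averaging n users, each user's contribution has sensitivity
    # clip_norm / n, so the noise standard deviation is scaled accordingly.
    noise = rng.normal(0.0, noise_multiplier * clip_norm / len(processed),
                       size=mean_update.shape)
    return mean_update + noise
```

In this reading, the regularizer discourages local training from producing updates much larger than the clipping bound, and sparsification concentrates the remaining norm on a few coordinates, so clipping discards less signal and the fixed-scale noise drowns out less of the averaged update.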
