Paper Title

Efficient and Privacy Preserving Group Signature for Federated Learning

Paper Authors

Sneha Kanchan, Jae Won Jang, Jun Yong Yoon, Bong Jun Choi

Paper Abstract

Federated Learning (FL) is a Machine Learning (ML) technique that aims to reduce the threats to user data privacy. Training is done using the raw data on the users' devices, called clients, and only the training results, called gradients, are sent to the server, where they are aggregated to generate an updated model. However, we cannot assume that the server can be trusted with private information, such as metadata related to the owner or source of the data. Hiding the client information from the server therefore helps reduce privacy-related attacks, so the privacy of the client's identity, along with the privacy of the client's data, is necessary to make such attacks more difficult. This paper proposes an efficient and privacy-preserving protocol for FL based on group signatures. A new group signature scheme for federated learning, called GSFL, is designed to not only protect the privacy of the client's data and identity but also significantly reduce the computation and communication costs, considering the iterative process of federated learning. We show that GSFL outperforms existing approaches in terms of computation, communication, and signaling costs. We also show that the proposed protocol can handle various security attacks in the federated learning environment.
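The abstract describes the GSFL flow at a high level: each client trains on its own raw data, signs the resulting gradient with a group signature so the server can verify group membership without learning which client sent it, and the server aggregates the verified gradients into an updated model. The snippet below is a minimal sketch of that message flow, assuming a hypothetical group_sign/group_verify interface; the HMAC stand-in and FedAvg-style averaging are illustrative only and are not the paper's actual scheme.

```python
# Minimal sketch of the GSFL message flow, assuming a hypothetical
# group-signature interface (group_sign / group_verify). A real deployment
# would use an actual group-signature scheme; the HMAC below is only a
# stand-in so the example runs end to end.
import hashlib
import hmac
import json
from typing import List, Tuple

GROUP_KEY = b"shared-group-secret"  # placeholder for the group manager's setup


def group_sign(gradient: List[float]) -> str:
    # Stand-in for a group signature: binds the message to group membership
    # without revealing which client produced it.
    msg = json.dumps(gradient).encode()
    return hmac.new(GROUP_KEY, msg, hashlib.sha256).hexdigest()


def group_verify(gradient: List[float], sig: str) -> bool:
    # The server checks only that the sender belongs to the group.
    msg = json.dumps(gradient).encode()
    expected = hmac.new(GROUP_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)


def client_update(local_gradient: List[float]) -> Tuple[List[float], str]:
    # A client uploads only (gradient, group signature); raw data and the
    # client's identity never leave the device.
    return local_gradient, group_sign(local_gradient)


def server_aggregate(updates: List[Tuple[List[float], str]]) -> List[float]:
    # The server verifies group membership of each update and averages the
    # accepted gradients (FedAvg-style) to produce the updated model.
    accepted = [grad for grad, sig in updates if group_verify(grad, sig)]
    return [sum(col) / len(accepted) for col in zip(*accepted)]


if __name__ == "__main__":
    updates = [client_update([0.1, 0.2]), client_update([0.3, 0.4])]
    print(server_aggregate(updates))  # aggregated gradient from anonymous clients
```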
