Paper Title
ARIANN: Low-Interaction Privacy-Preserving Deep Learning via Function Secret Sharing
Paper Authors
Paper Abstract
We propose AriaNN, a low-interaction privacy-preserving framework for private neural network training and inference on sensitive data. Our semi-honest 2-party computation protocol (with a trusted dealer) leverages function secret sharing, a recent lightweight cryptographic protocol that allows us to achieve an efficient online phase. We design optimized primitives for the building blocks of neural networks such as ReLU, MaxPool and BatchNorm. For instance, we perform private comparison for ReLU operations with a single message of the size of the input during the online phase, and with preprocessing keys close to 4X smaller than previous work. Last, we propose an extension to support n-party private federated learning. We implement our framework as an extensible system on top of PyTorch that leverages CPU and GPU hardware acceleration for cryptographic and machine learning operations. We evaluate our end-to-end system for private inference between distant servers on standard neural networks such as AlexNet, VGG16 or ResNet18, and for private training on smaller networks like LeNet. We show that computation rather than communication is the main bottleneck and that using GPUs together with reduced key size is a promising solution to overcome this barrier.
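The online phase described in the abstract can be illustrated with a toy example. The sketch below is a minimal, hypothetical illustration of the idea behind FSS-based private comparison, not AriaNN's actual construction: it uses a naive full-domain scheme over a tiny 8-bit ring (so keys are as large as the domain, whereas real FSS keys are compact), and the names `dealer_keygen` and `eval_key` are illustrative. A trusted dealer samples a random mask r and two keys that sum to the truth table of an offset comparison; online, the parties exchange a single masked value x + r (one message of the size of the input) and locally evaluate their keys to obtain additive shares of the ReLU sign bit.

```python
import numpy as np

RING = 256  # toy 8-bit ring Z_256; signed values live in [-128, 128)

def dealer_keygen(rng):
    """Offline phase (trusted dealer): sample a random mask r and FSS keys
    for the offset comparison f(u) = 1 iff (u - r) mod RING encodes x >= 0.

    Each key alone is uniformly random; the two evaluations sum (mod RING)
    to the comparison bit. Keys here are O(RING) long -- a real FSS scheme
    compresses them to roughly the bit length of the input."""
    r = int(rng.integers(RING))
    table = np.array([1 if (u - r) % RING < RING // 2 else 0 for u in range(RING)])
    k0 = rng.integers(RING, size=RING)   # party 0's key: uniform noise
    k1 = (table - k0) % RING             # party 1's key: k0 + k1 = table
    return r, k0, k1

def eval_key(key, u_hat):
    """Online phase: each party evaluates its key on the public masked input."""
    return int(key[u_hat % RING])

rng = np.random.default_rng(0)
x = -42                                   # secret input to ReLU
r, k0, k1 = dealer_keygen(rng)            # preprocessing, input-independent

u_hat = (x + r) % RING                    # the single online message
s0, s1 = eval_key(k0, u_hat), eval_key(k1, u_hat)

sign_bit = (s0 + s1) % RING               # reconstructed here only to test;
assert sign_bit == (1 if x >= 0 else 0)   # in the protocol it stays shared
print("ReLU(x) =", sign_bit * x)
```

In the actual protocol the sign bit remains secret-shared and is multiplied with x under the sharing rather than reconstructed; the reconstruction above only checks correctness of the toy scheme.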