Paper Title


FastStamp: Accelerating Neural Steganography and Digital Watermarking of Images on FPGAs

Authors

Shehzeen Hussain, Nojan Sheybani, Paarth Neekhara, Xinqiao Zhang, Javier Duarte, Farinaz Koushanfar

Abstract


Steganography and digital watermarking are the tasks of hiding recoverable data in image pixels. Deep neural network (DNN) based image steganography and watermarking techniques are quickly replacing traditional hand-engineered pipelines. DNN-based watermarking techniques have drastically improved the message capacity, imperceptibility, and robustness of the embedded watermarks. However, this improvement comes at the cost of increased computational overhead in the watermark encoder neural network. In this work, we design FastStamp, the first accelerator platform for performing DNN-based steganography and digital watermarking of images on hardware. We first propose a parameter-efficient DNN model for embedding recoverable bit-strings in image pixels. Our proposed model matches the success metrics of prior state-of-the-art DNN-based watermarking methods while being significantly faster and lighter in memory footprint. We then design an FPGA-based accelerator framework that further improves model throughput and power consumption by leveraging data parallelism and customized computation paths. FastStamp allows embedding hardware signatures into images to establish media authenticity and ownership of digital media. Our best design achieves 68x faster inference compared to GPU implementations of prior DNN-based watermark encoders while consuming less power.
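To make the underlying task concrete — "hiding recoverable data in image pixels" — the sketch below embeds a bit-string into an image using classical least-significant-bit (LSB) substitution. This is one of the traditional hand-engineered pipelines the abstract contrasts against, not the FastStamp DNN encoder; the function and variable names are illustrative assumptions.

```python
import numpy as np

def embed_bits(image: np.ndarray, bits: str) -> np.ndarray:
    """Hide a bit-string in the least-significant bits of pixel values.

    Classical hand-engineered baseline (LSB substitution), shown only to
    illustrate the watermarking task -- NOT the paper's DNN-based encoder.
    """
    flat = image.flatten().astype(np.uint8)
    if len(bits) > flat.size:
        raise ValueError("message longer than image capacity")
    for i, b in enumerate(bits):
        # Clear the lowest bit of the pixel, then write the message bit.
        flat[i] = (flat[i] & 0xFE) | int(b)
    return flat.reshape(image.shape)

def recover_bits(stego: np.ndarray, n_bits: int) -> str:
    """Read the embedded bit-string back out of the first n_bits pixels."""
    flat = stego.flatten()
    return "".join(str(int(p) & 1) for p in flat[:n_bits])
```

Each pixel changes by at most 1 intensity level, so the watermark is imperceptible — but unlike the DNN-based encoders the paper targets, plain LSB embedding is fragile under compression, resizing, and other common transformations.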
