Paper Title

Shallow Encoder Deep Decoder (SEDD) Networks for Image Encryption and Decryption

Paper Authors

Gupta, Chirag

Paper Abstract

This paper explores a new framework for lossy image encryption and decryption using a simple shallow encoder neural network E for encryption and a complex deep decoder neural network D for decryption. E is kept simple so that encoding can be done on low-power and portable devices, and it can in principle be any nonlinear function that outputs an encoded vector. D is trained to decode the encodings using a dataset of (image, encoded vector) pairs obtained from E, and this training happens independently of E. Because E, while being a simple neural network, still has thousands of random parameters, the encodings would be practically impossible to crack without D. This approach differs from autoencoders in that D is trained completely independently of E, although the structure may seem similar. The paper therefore also explores empirically whether a deep neural network can learn to reconstruct the original data in any useful form given only the output of another neural network or other nonlinear function, which can have very useful applications in cryptanalysis. Experiments demonstrate the potential of the framework through qualitative and quantitative evaluation of the decoded images from D, along with some limitations.
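The abstract does not fix a concrete architecture, so the following is a minimal PyTorch sketch of the SEDD setup under assumed choices: flattened 28x28 images, a 64-dimensional encoding, a single-layer tanh encoder E with frozen random weights, a fully connected decoder D, and an MSE reconstruction loss. All layer sizes, dimensions, and the loss function are illustrative assumptions, not the paper's specification.

```python
# Minimal sketch of the SEDD idea (assumed architecture, not the paper's exact setup).
# E is a shallow, randomly initialized encoder that is never trained; D is a deep
# decoder trained only on (encoding, image) pairs produced by E.
import torch
import torch.nn as nn

IMG_DIM = 28 * 28   # assumed flattened image size
CODE_DIM = 64       # assumed encoding length

# Shallow encoder E: one nonlinear layer whose fixed random parameters act as the "key".
encoder = nn.Sequential(nn.Linear(IMG_DIM, CODE_DIM), nn.Tanh())
for p in encoder.parameters():
    p.requires_grad_(False)  # E is never updated

# Deep decoder D: trained to invert E purely from data.
decoder = nn.Sequential(
    nn.Linear(CODE_DIM, 256), nn.ReLU(),
    nn.Linear(256, 512), nn.ReLU(),
    nn.Linear(512, 512), nn.ReLU(),
    nn.Linear(512, IMG_DIM), nn.Sigmoid(),
)

optimizer = torch.optim.Adam(decoder.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(images: torch.Tensor) -> float:
    """One training step on a batch of flattened images with values in [0, 1]."""
    with torch.no_grad():
        codes = encoder(images)      # encrypt: images -> encoded vectors
    recon = decoder(codes)           # decrypt: encoded vectors -> reconstructed images
    loss = loss_fn(recon, images)    # D only ever sees (codes, images) pairs
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    # Random data standing in for a real image dataset.
    batch = torch.rand(32, IMG_DIM)
    print(train_step(batch))
```

Note that, unlike an autoencoder, no gradient ever flows into the encoder here: D is fit to the fixed input-output behavior of E, which is the independence property the abstract emphasizes.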
