Paper Title

Exploiting Deep Generative Prior for Versatile Image Restoration and Manipulation

Paper Authors

Xingang Pan, Xiaohang Zhan, Bo Dai, Dahua Lin, Chen Change Loy, Ping Luo

Paper Abstract

Learning a good image prior is a long-standing goal for image restoration and manipulation. While existing methods like deep image prior (DIP) capture low-level image statistics, there are still gaps toward an image prior that captures rich image semantics, including color, spatial coherence, textures, and high-level concepts. This work presents an effective way to exploit the image prior captured by a generative adversarial network (GAN) trained on large-scale natural images. As shown in Fig. 1, the deep generative prior (DGP) provides compelling results in restoring the missing semantics, e.g., color, patch, and resolution, of various degraded images. It also enables diverse image manipulation, including random jittering, image morphing, and category transfer. Such highly flexible restoration and manipulation are made possible by relaxing the assumption of existing GAN-inversion methods, which tend to fix the generator. Notably, we allow the generator to be fine-tuned on the fly in a progressive manner, regularized by the feature distance obtained from the discriminator of the GAN. We show that these easy-to-implement and practical changes help keep the reconstruction within the manifold of natural images, and thus lead to more precise and faithful reconstruction of real images. Code is available at https://github.com/XingangPan/deep-generative-prior.
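To make the on-the-fly fine-tuning concrete, below is a minimal PyTorch-style sketch of the procedure the abstract describes: optimize the latent code first, then progressively unfreeze generator blocks while minimizing a discriminator feature distance on the degraded images. This is not the authors' implementation (see the linked repository for that); `generator`, `disc_features`, `degrade`, and the `z_dim`/`blocks` attributes are hypothetical placeholders standing in for a GAN pretrained on large-scale natural images and a known degradation operator.

```python
import torch
import torch.nn.functional as F

def dgp_reconstruct(generator, disc_features, degrade, target,
                    steps_per_stage=200, lr_z=1e-2, lr_g=1e-4):
    """Sketch of DGP-style reconstruction: optimize the latent code, then
    progressively fine-tune generator blocks (shallow to deep), guided by a
    feature distance measured in the discriminator's feature space."""
    # Assumptions: `generator.z_dim` and `generator.blocks` are attributes of
    # the hypothetical pretrained GAN; `disc_features(x)` returns a list of
    # intermediate discriminator feature maps for an image batch x.
    z = torch.randn(1, generator.z_dim, requires_grad=True)

    blocks = list(generator.blocks)
    for stage in range(len(blocks) + 1):
        # Stage 0 tunes only z; each later stage also unfreezes one more block,
        # so the generator is adapted progressively rather than all at once.
        groups = [{"params": [z], "lr": lr_z}]
        groups += [{"params": b.parameters(), "lr": lr_g} for b in blocks[:stage]]
        opt = torch.optim.Adam(groups)
        for _ in range(steps_per_stage):
            x = generator(z)
            # Compare the degraded reconstruction against the degraded
            # observation, so the loss only sees information actually present
            # in `target`. `degrade` must be differentiable (e.g. a gray-scale
            # projection for colorization, or downsampling for SR).
            loss = sum(F.l1_loss(fx, ft)
                       for fx, ft in zip(disc_features(degrade(x)),
                                         disc_features(target)))
            opt.zero_grad()
            loss.backward()
            opt.step()
    with torch.no_grad():
        # The full generator output restores the semantics (color, patch,
        # resolution) that the degradation removed.
        return generator(z)
```

Fine-tuning the generator itself, rather than only inverting the latent code as conventional GAN inversion does, is what lets the reconstruction match a real image closely while the discriminator feature distance keeps it on the natural-image manifold.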
