Paper Title

Universal Deep Image Compression via Content-Adaptive Optimization with Adapters

Paper Authors

Koki Tsubota, Hiroaki Akutsu, Kiyoharu Aizawa

Paper Abstract

Deep image compression performs better than conventional codecs, such as JPEG, on natural images. However, deep image compression is learning-based and encounters a problem: its compression performance deteriorates significantly for out-of-domain images. In this study, we highlight this problem and address a novel task: universal deep image compression. This task aims to compress images belonging to arbitrary domains, such as natural images, line drawings, and comics. To address this problem, we propose a content-adaptive optimization framework; the framework uses a pre-trained compression model and adapts the model to a target image during compression. Adapters are inserted into the decoder of the model. For each input image, our framework optimizes the latent representation extracted by the encoder and the adapter parameters in terms of rate-distortion. The adapter parameters are additionally transmitted per image. For the experiments, a benchmark dataset containing uncompressed images from four domains (natural images, line drawings, comics, and vector arts) is constructed, and the proposed universal deep image compression is evaluated on it. Finally, the proposed model is compared with non-adaptive and existing adaptive compression models. The comparison reveals that the proposed model outperforms these models. The code and dataset are publicly available at https://github.com/kktsubota/universal-dic.
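To make the per-image adaptation described in the abstract concrete, below is a minimal PyTorch-style sketch: the latent produced by a frozen pre-trained encoder and a set of zero-initialized adapters inside the decoder are jointly optimized under a rate-distortion loss for each input image, and both the quantized latent and the adapter weights would then be transmitted. The `codec` object with its `encode`, `decode`, `estimate_bits`, and `decoder_channels` members, the adapter design, and the straight-through rounding are illustrative assumptions only; they are not the actual API or exact procedure of the paper's repository.

```python
# Hypothetical sketch of per-image content-adaptive optimization with adapters.
# `codec` and its methods are illustrative stand-ins, not the universal-dic API.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Adapter(nn.Module):
    """Lightweight residual 1x1-conv adapter inserted after a decoder layer (assumed design)."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=1)
        # Zero-init so the adapter starts as an identity mapping and only deviates
        # from the pre-trained decoder when it improves the rate-distortion loss.
        nn.init.zeros_(self.conv.weight)
        nn.init.zeros_(self.conv.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.conv(x)


def adapt_to_image(codec, image, lmbda=0.01, steps=500, lr=1e-3):
    """Per-image rate-distortion optimization of the latent and adapter parameters."""
    with torch.no_grad():
        y = codec.encode(image)                     # latent from the frozen, pre-trained encoder
    y = y.clone().requires_grad_(True)

    adapters = [Adapter(c) for c in codec.decoder_channels]
    params = [y] + [p for a in adapters for p in a.parameters()]
    optimizer = torch.optim.Adam(params, lr=lr)

    for _ in range(steps):
        optimizer.zero_grad()
        y_hat = y + (torch.round(y) - y).detach()   # straight-through rounding for quantization
        x_hat = codec.decode(y_hat, adapters=adapters)
        rate = codec.estimate_bits(y_hat)           # bits spent on the latent representation
        distortion = F.mse_loss(x_hat, image)
        loss = rate + lmbda * distortion            # rate-distortion objective
        loss.backward()
        optimizer.step()

    # In the actual method the adapter parameters are also transmitted per image,
    # so their coding cost counts toward the rate; that overhead is omitted here.
    return torch.round(y).detach(), adapters
```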
