Paper Title

Fast Conditional Network Compression Using Bayesian HyperNetworks

Paper Authors

Phuoc Nguyen, Truyen Tran, Ky Le, Sunil Gupta, Santu Rana, Dang Nguyen, Trong Nguyen, Shannon Ryan, Svetha Venkatesh

Abstract

We introduce a conditional compression problem and propose a fast framework for tackling it. The problem is how to quickly compress a pretrained large neural network into an optimal smaller network for a given target context, e.g., a context involving only a subset of classes, or one where only limited compute resources are available. To solve this, we propose an efficient Bayesian framework that compresses a given large network into a much smaller network tailored to meet each contextual requirement. We employ a hypernetwork to parameterize the posterior distribution of weights given conditional inputs and minimize a variational objective of this Bayesian neural network. To further reduce the network size, we propose a new input-output group sparsity factorization of the weights to encourage more sparsity in the generated weights. Our method can quickly generate compressed networks that are significantly smaller than those produced by baseline methods.
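The abstract's core mechanism — a hypernetwork that maps a context encoding to the parameters of a weight posterior, with a group penalty over input and output weight groups — can be sketched minimally as follows. This is an illustrative toy, not the paper's implementation: all sizes, the single-linear-layer hypernetwork, and the context vector are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a target layer with 8 inputs and 4 outputs,
# and a 3-dimensional context embedding (purely illustrative).
CTX_DIM, IN_DIM, OUT_DIM = 3, 8, 4
W_SIZE = IN_DIM * OUT_DIM

# Toy hypernetwork: a single linear map from the context vector to the
# posterior parameters (mean and log-std) of the target layer's weights.
H_mu = rng.normal(0.0, 0.1, size=(CTX_DIM, W_SIZE))
H_logsig = rng.normal(0.0, 0.1, size=(CTX_DIM, W_SIZE))

def sample_weights(context):
    """Reparameterized sample w = mu(c) + sigma(c) * eps from the
    context-conditioned Gaussian posterior over layer weights."""
    mu = context @ H_mu
    sigma = np.exp(context @ H_logsig)
    eps = rng.standard_normal(W_SIZE)
    w = (mu + sigma * eps).reshape(IN_DIM, OUT_DIM)
    return w, mu, sigma

def group_sparsity_penalty(w):
    """Sum of L2 norms over input groups (rows) and output groups
    (columns); adding this to the variational objective pushes whole
    rows/columns toward zero, so entire neurons can be pruned."""
    row_norms = np.linalg.norm(w, axis=1)  # one norm per input group
    col_norms = np.linalg.norm(w, axis=0)  # one norm per output group
    return row_norms.sum() + col_norms.sum()

# A context stands in for, e.g., an encoding of the target class subset
# or the available compute budget.
context = rng.standard_normal(CTX_DIM)
w, mu, sigma = sample_weights(context)
penalty = group_sparsity_penalty(w)
print(w.shape, float(penalty) > 0.0)
```

In the full method this penalty would be one term of the variational objective minimized during training; here it only shows why grouping by rows and columns prunes whole input or output neurons rather than scattered individual weights.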
