Paper Title

Network Adjustment: Channel Search Guided by FLOPs Utilization Ratio

Paper Authors

Zhengsu Chen, Jianwei Niu, Lingxi Xie, Xuefeng Liu, Longhui Wei, Qi Tian

Paper Abstract

Automatically designing computationally efficient neural networks has received much attention in recent years. Existing approaches either utilize network pruning or leverage network architecture search methods. This paper presents a new framework named network adjustment, which considers network accuracy as a function of FLOPs, so that under each network configuration, one can estimate the FLOPs utilization ratio (FUR) for each layer and use it to determine whether to increase or decrease the number of channels in that layer. Note that FUR, like the gradient of a non-linear function, is accurate only in a small neighborhood of the current network. Hence, we design an iterative mechanism in which the initial network undergoes a number of steps, each with a small 'adjusting rate' that controls the changes to the network. The computational overhead of the entire search process is reasonable, i.e., comparable to that of re-training the final model from scratch. Experiments on standard image classification datasets and a wide range of base networks demonstrate the effectiveness of our approach, which consistently outperforms the pruning counterpart. The code is available at https://github.com/danczs/NetworkAdjustment.
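The abstract outlines an iterative loop: estimate each layer's FLOPs utilization ratio (the accuracy change per unit of FLOPs from perturbing that layer's width), then nudge channel counts by a small adjusting rate and repeat, since the FUR estimate is only trusted near the current configuration. Below is a minimal Python sketch of that loop under stated assumptions; the function names (estimate_fur, evaluate_accuracy, layer_flops), the toy FLOPs model, and the exact update rule are illustrative placeholders, not the authors' implementation (see the linked repository for that).

```python
# Hypothetical sketch of FUR-guided channel adjustment, following the
# description in the abstract. All names and the update rule are
# illustrative assumptions, not the paper's actual code.
import random


def layer_flops(channels, spatial=32 * 32, kernel=3 * 3):
    """Toy FLOPs model: proportional to the square of the channel count."""
    return channels * channels * kernel * spatial


def evaluate_accuracy(channel_config):
    """Placeholder: in the real method this is a (short) training run."""
    return random.random()


def estimate_fur(channel_config, delta=2):
    """Estimate each layer's FUR: accuracy gain per extra FLOP when the
    layer is widened slightly. Like a gradient, it is only reliable in a
    small neighborhood of the current network."""
    base_acc = evaluate_accuracy(channel_config)
    furs = []
    for i, c in enumerate(channel_config):
        probe = list(channel_config)
        probe[i] = c + delta
        d_acc = evaluate_accuracy(probe) - base_acc
        d_flops = layer_flops(c + delta) - layer_flops(c)
        furs.append(d_acc / d_flops)
    return furs


def adjust_network(channel_config, steps=10, adjust_rate=0.1):
    """Iteratively widen high-FUR layers and narrow low-FUR layers,
    keeping each step small so the FUR estimates stay valid."""
    config = list(channel_config)
    for _ in range(steps):
        furs = estimate_fur(config)
        order = sorted(range(len(config)), key=lambda i: furs[i])
        k = max(1, int(adjust_rate * len(config)))
        for i in order[:k]:   # lowest FUR: FLOPs poorly spent, shrink
            config[i] = max(8, int(config[i] * (1 - adjust_rate)))
        for i in order[-k:]:  # highest FUR: FLOPs well spent, grow
            config[i] = int(config[i] * (1 + adjust_rate))
    return config


if __name__ == "__main__":
    # Example: adjust a 4-layer configuration over 10 small steps.
    print(adjust_network([64, 128, 256, 512]))
```

With a real evaluate_accuracy, the small per-step adjusting rate is the key design choice: it keeps each new configuration inside the neighborhood where the FUR estimate, like a first-order gradient, remains meaningful.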
