Paper Title
LEAN: graph-based pruning for convolutional neural networks by extracting longest chains
Paper Authors
Paper Abstract
Neural network pruning techniques can substantially reduce the computational cost of applying convolutional neural networks (CNNs). Common pruning methods determine which convolutional filters to remove by ranking the filters individually, i.e., without taking their interdependence into account. In this paper, we advocate the viewpoint that pruning should consider the interdependence between series of consecutive operators. We propose the LongEst-chAiN (LEAN) method, which prunes CNNs by using graph-based algorithms to select relevant chains of convolutions. A CNN is interpreted as a graph, with the operator norm of each operator serving as the distance metric for the edges. LEAN pruning iteratively extracts the highest-value path from the graph to keep. In our experiments, we test LEAN pruning on several image-to-image tasks, including the well-known CamVid dataset and a real-world X-ray CT dataset. Results indicate that LEAN pruning can yield networks with similar accuracy while using 1.7-12x fewer convolutional filters than existing approaches.
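The abstract describes the core mechanism only at a high level: interpret the CNN as a graph with operator norms as edge values, then repeatedly extract the highest-value chain of consecutive convolutions to keep. The sketch below illustrates that idea under stated assumptions; it is not the authors' implementation. The helper names (`build_pruning_graph`, `extract_chains`) are hypothetical, and using the log of the operator norm as the edge weight (so that the longest path maximizes the product of norms along a chain) is an assumption.

```python
# Illustrative sketch of longest-chain extraction, assuming log-of-norm edge
# weights and a DAG of convolution operators. Not the paper's implementation.
import math
import networkx as nx


def build_pruning_graph(operator_norms):
    """Build a DAG whose edges are convolution operators.

    operator_norms: dict mapping (src_node, dst_node) -> operator norm (> 0)
    of the convolution connecting two feature maps. Edge weight is log(norm),
    so the 'longest' path maximizes the product of norms along a chain.
    """
    g = nx.DiGraph()
    for (src, dst), norm in operator_norms.items():
        g.add_edge(src, dst, weight=math.log(norm))
    return g


def extract_chains(graph, num_chains):
    """Iteratively pull out the highest-value chains of operators to keep."""
    kept_edges = []
    g = graph.copy()
    for _ in range(num_chains):
        if g.number_of_edges() == 0:
            break
        # Path with the highest cumulative log-norm in the DAG.
        path = nx.dag_longest_path(g, weight="weight")
        chain = list(zip(path, path[1:]))
        if not chain:
            break
        kept_edges.extend(chain)
        g.remove_edges_from(chain)  # extracted operators leave the graph
    return kept_edges
```

In this reading, convolutions that never appear on an extracted chain are candidates for pruning, which is how chain extraction accounts for the interdependence between consecutive operators rather than ranking each filter in isolation.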