Paper Title
Understanding the effect of sparsity on neural networks robustness
Authors
Abstract
This paper examines the impact of static sparsity on the robustness of a trained network to weight perturbations, data corruption, and adversarial examples. We show that, up to a certain sparsity achieved by increasing network width and depth while keeping the network capacity fixed, sparsified networks consistently match and often outperform their initially dense versions. Robustness and accuracy decline simultaneously for very high sparsity due to loose connectivity between network layers. Our findings show that a rapid robustness drop caused by network compression observed in the literature is due to a reduced network capacity rather than sparsity.
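As an illustration of the setup described in the abstract, the sketch below shows one common way to obtain a statically sparse layer (unstructured magnitude pruning) and, under the simplifying assumption that a fully connected layer's parameter count grows quadratically with its width, the width multiplier needed to keep the nonzero-parameter budget fixed as sparsity increases. The function names, the pruning criterion, and the square-layer assumption are illustrative choices for this sketch, not details taken from the paper.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries so that at least a
    `sparsity` fraction of the weights is exactly zero (static,
    unstructured pruning)."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value acts as the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

def width_multiplier(sparsity):
    """For a square fully connected layer (parameters ~ width**2),
    the factor by which width must grow so that the number of
    *nonzero* parameters matches the dense baseline."""
    return 1.0 / np.sqrt(1.0 - sparsity)

# Example: at 90% sparsity a layer needs roughly 3.16x the width to
# keep the same nonzero-parameter budget as its dense counterpart.
w = np.random.randn(256, 256)
w_sparse = magnitude_prune(w, sparsity=0.9)
print(f"nonzero fraction: {np.count_nonzero(w_sparse) / w_sparse.size:.2f}")
print(f"width multiplier at 90% sparsity: {width_multiplier(0.9):.2f}")
```

Under this capacity-matched scaling, comparing the pruned wide network against the original dense one isolates the effect of sparsity itself from the effect of simply removing parameters, which is the comparison the abstract refers to.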