Paper Title
Growing Neural Network with Shared Parameter
Paper Authors
Paper Abstract
We propose a general method for growing a neural network with shared parameters by matching the trained network to new inputs. By leveraging Hoeffding's inequality, we provide a theoretical basis for improving performance by adding subnetworks to an existing network. Building on this basis, we implement a matching method that applies trained subnetworks of the existing network to new inputs. Our method improves performance with higher parameter efficiency. It can also be applied to the trans-task case, realizing transfer learning by changing the combination of subnetworks without training on the new task.
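For reference, the concentration bound the abstract alludes to is the standard statement of Hoeffding's inequality; the specific way the paper applies it to subnetwork addition is not shown here. For independent random variables $X_1, \dots, X_n$ with $X_i \in [a_i, b_i]$ and $S_n = \sum_{i=1}^{n} X_i$:

\[
P\left(S_n - \mathbb{E}[S_n] \ge t\right) \le \exp\left(-\frac{2t^2}{\sum_{i=1}^{n}(b_i - a_i)^2}\right), \quad t > 0.
\]

Intuitively, such a bound can be used to argue that an aggregate over more (sub)components concentrates more tightly around its expectation, which is the kind of guarantee one would need to justify that adding a subnetwork improves performance with high probability.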