Paper Title
Growing Artificial Neural Networks
Paper Authors
Paper Abstract
Pruning is a legitimate method for reducing the size of a neural network to fit in low SWaP (size, weight, and power) hardware, but the networks must be trained and pruned offline. We propose an algorithm, Artificial Neurogenesis (ANG), that grows rather than prunes the network and enables neural networks to be trained and executed in low SWaP embedded hardware. ANG accomplishes this by using the training data to determine critical connections between layers before the actual training takes place. Our experiments use a modified LeNet-5 as a baseline neural network that achieves a test accuracy of 98.74% using a total of 61,160 weights. An ANG-grown network achieves a test accuracy of 98.80% with only 21,211 weights.
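The abstract's central idea, using the training data to select critical connections before training begins, could be sketched roughly as follows. This is purely an illustrative assumption: the paper's actual scoring rule is not given in the abstract, so the correlation-based score, the function name, and the `keep_ratio` parameter below are all hypothetical.

```python
import numpy as np

def select_critical_connections(X, n_hidden, keep_ratio=0.35, seed=0):
    """Hypothetical sketch of pre-training connection selection.

    Scores each candidate input->hidden connection by the absolute
    correlation between the input feature and the hidden unit's
    pre-activation over the training data X, then keeps only the
    top `keep_ratio` fraction of connections (the rest are masked
    out before any training takes place).
    """
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    W = rng.standard_normal((n_in, n_hidden)) * 0.1  # candidate weights
    H = X @ W                                        # hidden pre-activations

    # Importance score: |corr(feature_i, hidden_j)| over the training set.
    Xc = X - X.mean(axis=0)
    Hc = H - H.mean(axis=0)
    denom = np.outer(Xc.std(axis=0) + 1e-8, Hc.std(axis=0) + 1e-8)
    score = np.abs((Xc.T @ Hc) / len(X)) / denom

    # "Grow" the layer: retain only the highest-scoring connections.
    k = int(keep_ratio * score.size)
    thresh = np.partition(score.ravel(), -k)[-k]
    mask = score >= thresh
    return W * mask, mask
```

In this sketch the network starts sparse rather than being pruned after training, which is what allows the selection step itself to run on the same low-SWaP device that will train and execute the network.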