Paper Title
Quantitative Gaussian Approximation of Randomly Initialized Deep Neural Networks
Paper Authors
Paper Abstract
Given any deep fully connected neural network initialized with random Gaussian parameters, we bound from above the quadratic Wasserstein distance between its output distribution and a suitable Gaussian process. Our explicit inequalities indicate how the sizes of the hidden and output layers affect the Gaussian behaviour of the network, and quantitatively recover the distributional convergence results in the wide limit, i.e., when all the hidden layer sizes become large.
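As a concrete numerical illustration (a minimal sketch, not the paper's construction), the following Python snippet compares the finite-dimensional marginals of a randomly Gaussian-initialized ReLU network with the covariance of its wide-limit Gaussian process, computed via the standard ReLU (arc-cosine) kernel recursion, and measures the gap with the closed-form 2-Wasserstein distance between Gaussians. For simplicity the empirical output law is itself approximated by a Gaussian fit; all widths, variance scalings, and parameter names (`sigma_w`, `sigma_b`, `n_inits`) are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import sqrtm

rng = np.random.default_rng(0)

# Illustrative (hypothetical) choices, not taken from the paper:
d_in, widths, d_out = 3, [256, 256], 1    # input dim, hidden widths, output dim
sigma_w, sigma_b = 1.5, 0.1               # weight / bias standard-deviation scales
n_inits = 5000                            # number of independent initializations
X = rng.standard_normal((4, d_in))        # a few fixed test inputs

def forward(X):
    """Output of one freshly initialized ReLU network, W ~ N(0, sigma_w^2 / fan_in)."""
    dims = [d_in] + widths + [d_out]
    h = X
    for l in range(len(dims) - 1):
        W = rng.standard_normal((dims[l], dims[l + 1])) * sigma_w / np.sqrt(dims[l])
        b = rng.standard_normal(dims[l + 1]) * sigma_b
        h = h @ W + b
        if l < len(dims) - 2:              # ReLU on hidden layers only
            h = np.maximum(h, 0.0)
    return h[:, 0]

# Empirical output distribution over many random initializations.
samples = np.stack([forward(X) for _ in range(n_inits)])
m_emp, S_emp = samples.mean(axis=0), np.cov(samples.T)

# Wide-limit Gaussian-process covariance via the standard ReLU (arc-cosine) recursion:
# K^{l+1}(x, x') = sigma_w^2 * E[ReLU(u) ReLU(v)] + sigma_b^2, with (u, v) ~ N(0, K^l).
K = sigma_w**2 * (X @ X.T) / d_in + sigma_b**2
for _ in widths:
    s = np.sqrt(np.outer(np.diag(K), np.diag(K)))
    c = np.clip(K / s, -1.0, 1.0)
    theta = np.arccos(c)
    K = sigma_w**2 * s * (np.sin(theta) + (np.pi - theta) * c) / (2 * np.pi) + sigma_b**2

def w2_gaussians(m1, S1, m2, S2):
    """Closed-form 2-Wasserstein distance between Gaussians N(m1, S1) and N(m2, S2)."""
    A = np.real(sqrtm(S1))
    cross = np.real(sqrtm(A @ S2 @ A))
    return np.sqrt(np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2 * cross))

print("W2(Gaussian fit of network outputs, wide limit) ~",
      w2_gaussians(m_emp, S_emp, np.zeros(len(X)), K))
```

Increasing the hidden widths in this sketch should shrink the reported distance, consistent with the wide-limit convergence the abstract describes.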