Paper Title

On the derivatives of feed-forward neural networks

Authors

Rabah Abdul Khalek, Valerio Bertone

Abstract

In this paper we present a C++ implementation of the analytic derivative of a feed-forward neural network with respect to its free parameters for an arbitrary architecture, known as back-propagation. We dubbed this code NNAD (Neural Network Analytic Derivatives) and interfaced it with the widely-used ceres-solver minimiser to fit neural networks to pseudodata in two different least-squares problems. The first is a direct fit of Legendre polynomials. The second is a somewhat more involved minimisation problem where the function to be fitted takes part in an integral. Finally, using a consistent framework, we assess the efficiency of our analytic derivative formula as compared to numerical and automatic differentiation as provided by ceres-solver. We thus demonstrate the advantage of using NNAD in problems involving both deep and shallow neural networks.
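To make the idea concrete, the sketch below shows the one-hidden-layer case of supplying an analytic Jacobian to ceres-solver, which is the general pattern the abstract describes. This is not NNAD's actual API: the network f(x) = Σ_j v_j tanh(w_j x + b_j), the hidden width H, the parameter-block layout [w | b | v], and the toy pseudodata are all illustrative assumptions; only the ceres::SizedCostFunction interface is real ceres-solver.

```cpp
// Minimal sketch (not NNAD): fit a one-hidden-layer network to toy
// pseudodata with ceres-solver, providing the analytic Jacobian by
// hand. H and the parameter layout [w | b | v] are assumptions.
#include <cmath>
#include <vector>
#include <ceres/ceres.h>

constexpr int H = 5;            // hidden-layer width (assumed)
constexpr int NPAR = 3 * H;     // free parameters: w_j, b_j, v_j

// One residual r = f(x) - y per data point, with analytic derivatives.
class NetResidual : public ceres::SizedCostFunction<1, NPAR> {
 public:
  NetResidual(double x, double y) : x_(x), y_(y) {}

  bool Evaluate(double const* const* parameters,
                double* residuals, double** jacobians) const override {
    const double* w = parameters[0];          // input weights
    const double* b = parameters[0] + H;      // biases
    const double* v = parameters[0] + 2 * H;  // output weights

    // Forward pass: f(x) = sum_j v_j * tanh(w_j * x + b_j).
    double t[H];
    double f = 0.0;
    for (int j = 0; j < H; ++j) {
      t[j] = std::tanh(w[j] * x_ + b[j]);
      f += v[j] * t[j];
    }
    residuals[0] = f - y_;

    // Analytic Jacobian: the one-layer instance of back-propagation.
    if (jacobians && jacobians[0]) {
      for (int j = 0; j < H; ++j) {
        const double dt = 1.0 - t[j] * t[j];      // tanh'(w_j x + b_j)
        jacobians[0][j]         = v[j] * dt * x_; // dr/dw_j
        jacobians[0][H + j]     = v[j] * dt;      // dr/db_j
        jacobians[0][2 * H + j] = t[j];           // dr/dv_j
      }
    }
    return true;
  }

 private:
  const double x_, y_;
};

int main() {
  // Toy pseudodata: y = x^2 sampled on [-1, 1].
  std::vector<double> params(NPAR, 0.1);
  ceres::Problem problem;
  for (int i = 0; i <= 20; ++i) {
    const double x = -1.0 + 0.1 * i;
    problem.AddResidualBlock(new NetResidual(x, x * x),
                             nullptr, params.data());
  }
  ceres::Solver::Options options;
  ceres::Solver::Summary summary;
  ceres::Solve(options, &problem, &summary);
}
```

NNAD generalises the hand-written loop above to arbitrary architectures; the comparison in the paper is precisely between such analytic Jacobians and the numerical or automatic differentiation that ceres-solver would otherwise fall back on.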
