Paper Title
Stable Neural Flows
Paper Authors
Paper Abstract
We introduce a provably stable variant of neural ordinary differential equations (neural ODEs) whose trajectories evolve on an energy functional parametrised by a neural network. Stable neural flows provide an implicit guarantee of asymptotic stability of the depth-flows, leading to robustness against input perturbations and a low computational burden for the numerical solver. The learning procedure is cast as an optimal control problem, and an approximate solution is proposed based on adjoint sensitivity analysis. We further introduce novel regularizers designed to ease the optimization process and speed up convergence. The proposed model class is evaluated on non-linear classification and function approximation tasks.
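To make the construction concrete, the sketch below shows one possible instance in PyTorch. It is our own illustration under stated assumptions, not the authors' reference code: the module names `EnergyNet` and `StableNeuralFlow` are hypothetical, and for brevity it backpropagates through a fixed-step explicit Euler integrator rather than the adjoint sensitivity analysis and adaptive solvers mentioned in the abstract.

```python
# Minimal sketch (an illustration, not the authors' implementation) of the idea
# behind stable neural flows: the state follows the negative gradient of a
# scalar energy E_theta(x) parametrised by a neural network, so E_theta is
# non-increasing along the trajectory.
import torch
import torch.nn as nn


class EnergyNet(nn.Module):
    """Scalar energy functional E_theta: R^d -> R (hypothetical module name)."""

    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Sum over the batch: samples are independent, so the gradient of the
        # sum with respect to x recovers the per-sample gradients.
        return self.net(x).sum()


class StableNeuralFlow(nn.Module):
    """Integrates dx/dt = -grad_x E_theta(x) with explicit Euler steps."""

    def __init__(self, energy: nn.Module, steps: int = 20, dt: float = 0.05):
        super().__init__()
        self.energy = energy
        self.steps = steps
        self.dt = dt

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not x.requires_grad:
            x = x.detach().requires_grad_(True)
        for _ in range(self.steps):
            e = self.energy(x)
            # create_graph=True keeps the vector field differentiable so the
            # energy network can be trained end to end through the flow.
            (grad_e,) = torch.autograd.grad(e, x, create_graph=True)
            x = x - self.dt * grad_e  # gradient flow: the energy cannot increase
        return x


# Toy usage: push 2-D inputs through the flow, then classify with a linear head.
flow = StableNeuralFlow(EnergyNet(dim=2))
head = nn.Linear(2, 2)
inputs = torch.randn(32, 2)
logits = head(flow(inputs))
print(logits.shape)  # torch.Size([32, 2])
```

Because the vector field is the negative gradient of a scalar energy, the energy is non-increasing along every trajectory, which is the source of the implicit stability guarantee described in the abstract.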