Paper Title


Generalized Gradient Flows with Provable Fixed-Time Convergence and Fast Evasion of Non-Degenerate Saddle Points

Authors

Baranwal, Mayank, Budhraja, Param, Raj, Vishal, Hota, Ashish R.

Abstract


Gradient-based first-order convex optimization algorithms find widespread applicability in a variety of domains, including machine learning tasks. Motivated by the recent advances in fixed-time stability theory of continuous-time dynamical systems, we introduce a generalized framework for designing accelerated optimization algorithms with strongest convergence guarantees that further extend to a subclass of non-convex functions. In particular, we introduce the GenFlow algorithm and its momentum variant that provably converge to the optimal solution of objective functions satisfying the Polyak-Łojasiewicz (PL) inequality in a fixed time. Moreover, for functions that admit non-degenerate saddle-points, we show that for the proposed GenFlow algorithm, the time required to evade these saddle-points is uniformly bounded for all initial conditions. Finally, for strongly convex-strongly concave minimax problems whose optimal solution is a saddle point, a similar scheme is shown to arrive at the optimal solution again in a fixed time. The superior convergence properties of our algorithm are validated experimentally on a variety of benchmark datasets.
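The abstract does not spell out GenFlow's dynamics, but fixed-time-stable gradient flows in the literature it builds on typically combine two rescaled gradient terms: one that dominates near the optimum (giving finite-time convergence) and one that dominates far from it (making the convergence time uniformly bounded over all initial conditions). The following is a minimal sketch of that generic construction on a quadratic objective satisfying the PL inequality; the flow form, the parameters `p` and `q`, and the forward-Euler discretization are illustrative assumptions, not the paper's actual GenFlow algorithm.

```python
import numpy as np

# Objective: f(x) = 0.5 * ||x||^2, a simple function satisfying the
# Polyak-Lojasiewicz (PL) inequality; its gradient is grad f(x) = x.
def grad_f(x):
    return x

def fixed_time_flow_step(x, dt=1e-3, p=3.0, q=1.5):
    """One forward-Euler step of a generic two-term fixed-time gradient flow
        dx/dt = -g / ||g||^{(p-2)/(p-1)} - g / ||g||^{(q-2)/(q-1)},
    with p > 2 > q > 1 and g = grad f(x). The first term scales like
    ||g||^{1/(p-1)} near the optimum (finite-time convergence); the second
    scales like ||g||^{(2-q)/(q-1)+1} far away (uniform time bound)."""
    g = grad_f(x)
    n = np.linalg.norm(g)
    term1 = g / n ** ((p - 2.0) / (p - 1.0))  # dominates near the optimum
    term2 = g / n ** ((q - 2.0) / (q - 1.0))  # dominates far from it
    return x - dt * (term1 + term2)

# Integrate until the gradient norm is small; the loop guard also keeps the
# step function from dividing by a (near-)zero gradient norm.
x = np.array([5.0, -3.0])
n_steps = 0
while np.linalg.norm(grad_f(x)) > 1e-6 and n_steps < 100_000:
    x = fixed_time_flow_step(x)
    n_steps += 1
```

In continuous time, flows of this shape reach the optimum within a time bound that does not depend on the initial condition; after discretization the step size `dt` matters, and a fixed `dt` only approximates that behavior.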
