Paper Title

Accelerated Algorithms for Convex and Non-Convex Optimization on Manifolds

Authors

Lizhen Lin, Bayan Saparbayeva, Michael Minyi Zhang, David B. Dunson

Abstract


We propose a general scheme for solving convex and non-convex optimization problems on manifolds. The central idea is that, by adding a multiple of the squared retraction distance to the objective function in question, we "convexify" the objective function and solve a series of convex sub-problems in the optimization procedure. One of the key challenges for optimization on manifolds is the difficulty of verifying the complexity of the objective function, e.g., whether the objective function is convex or non-convex, and the degree of non-convexity. Our proposed algorithm adapts to the level of complexity in the objective function. We show that when the objective function is convex, the algorithm provably converges to the optimum and leads to accelerated convergence. When the objective function is non-convex, the algorithm will converge to a stationary point. Our proposed method unifies insights from Nesterov's original idea for accelerating gradient descent algorithms with recent developments in optimization algorithms in Euclidean space. We demonstrate the utility of our algorithms on several manifold optimization tasks such as estimating intrinsic and extrinsic Fréchet means on spheres and low-rank matrix factorization with Grassmann manifolds applied to the Netflix rating data set.
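One of the example tasks in the abstract is estimating the intrinsic Fréchet mean on a sphere, i.e., minimizing the average squared geodesic distance to the data. The sketch below uses plain Riemannian gradient descent with the normalization retraction on the unit sphere; it is an illustrative baseline only, not the paper's accelerated or convexified scheme, and the function names, step size, and iteration count are assumptions.

```python
import numpy as np

def project_to_sphere(x):
    # Normalization is a valid retraction on the unit sphere.
    return x / np.linalg.norm(x)

def frechet_mean_sphere(points, steps=200, lr=0.1):
    """Riemannian gradient descent for the intrinsic Frechet mean on the sphere.

    Minimizes F(x) = (1/2n) * sum_i d(x, p_i)^2, where d is the geodesic
    (arc-length) distance. Illustrative sketch only, not the paper's algorithm.
    """
    # Start from the projected extrinsic (Euclidean) mean.
    x = project_to_sphere(points.mean(axis=0))
    for _ in range(steps):
        # Riemannian gradient of F at x is -(1/n) * sum_i Log_x(p_i).
        grad = np.zeros_like(x)
        for p in points:
            c = np.clip(x @ p, -1.0, 1.0)
            theta = np.arccos(c)          # geodesic distance d(x, p)
            v = p - c * x                 # component of p tangent to the sphere at x
            nv = np.linalg.norm(v)
            if nv > 1e-12:
                grad -= (theta / nv) * v  # -Log_x(p)
        grad /= len(points)
        # Retraction step: move against the gradient, then project back.
        x = project_to_sphere(x - lr * grad)
    return x
```

For tightly clustered data the objective is geodesically convex in a neighborhood of the data, so this vanilla scheme converges; the paper's contribution is handling the convex and non-convex regimes adaptively with accelerated rates.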
