Paper Title

An Adaptive Incremental Gradient Method With Support for Non-Euclidean Norms

Authors

Binghui Xie, Chenhan Jin, Kaiwen Zhou, James Cheng, Wei Meng

Abstract

Stochastic variance reduced methods have shown strong performance in solving finite-sum problems. However, these methods usually require users to tune the step size manually, which is time-consuming or even infeasible for some large-scale optimization tasks. To overcome this problem, we propose and analyze several novel adaptive variants of the popular SAGA algorithm. In particular, we design a variant of the Barzilai-Borwein step size tailored to incremental gradient methods that ensures memory efficiency and fast convergence. We establish its convergence guarantees under general settings that allow non-Euclidean norms in the definition of smoothness and in the composite objective, covering a broad range of applications in machine learning. We also improve the analysis of SAGA to support non-Euclidean norms, which fills a gap in existing work. Numerical experiments on standard datasets demonstrate the competitive performance of the proposed algorithm compared with existing variance-reduced methods and their adaptive variants.
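
For context, the sketch below illustrates the two ingredients the abstract combines: SAGA-style variance reduction for finite-sum objectives, and the classical Barzilai-Borwein (BB) step size eta_k = ||x_k - x_{k-1}||^2 / |<x_k - x_{k-1}, g_k - g_{k-1}>|. It is a minimal illustration under plain Euclidean assumptions, not the paper's algorithm: the helper grad_i and the per-epoch full-gradient BB update are hypothetical conveniences, whereas the paper's variant is specifically designed to avoid full-gradient passes (for memory efficiency) and to handle non-Euclidean norms and composite terms.

import numpy as np

def saga_bb(grad_i, n, x0, epochs=20, eta0=0.1, seed=0):
    # grad_i(i, x): gradient of the i-th component f_i at x (hypothetical helper).
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    table = np.stack([grad_i(i, x) for i in range(n)])  # stored gradient per component
    avg = table.mean(axis=0)                            # running average of the table
    eta = eta0
    x_prev, g_prev = x.copy(), avg.copy()
    for _ in range(epochs):
        for _ in range(n):
            i = int(rng.integers(n))
            g_new = grad_i(i, x)
            v = g_new - table[i] + avg        # SAGA variance-reduced gradient estimate
            avg += (g_new - table[i]) / n     # keep the table average consistent
            table[i] = g_new
            x -= eta * v
        # Classical BB step size, recomputed once per epoch from full gradients;
        # the paper's memory-efficient variant avoids this full pass.
        g_full = np.mean([grad_i(i, x) for i in range(n)], axis=0)
        s, y = x - x_prev, g_full - g_prev
        if abs(float(s @ y)) > 1e-12:
            eta = float(s @ s) / abs(float(s @ y))
        x_prev, g_prev = x.copy(), g_full.copy()
    return x

As a quick usage check on a toy least-squares problem with f_i(x) = 0.5 * (a_i @ x - b_i)^2:

A = np.random.default_rng(1).normal(size=(100, 5))
b = A @ np.ones(5)
x_hat = saga_bb(lambda i, x: (A[i] @ x - b[i]) * A[i], n=100, x0=np.zeros(5))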
