Title

IDEAL: Inexact DEcentralized Accelerated Augmented Lagrangian Method

Authors

Yossi Arjevani, Joan Bruna, Bugra Can, Mert Gürbüzbalaban, Stefanie Jegelka, Hongzhou Lin

Abstract

We introduce a framework for designing primal methods under the decentralized optimization setting where local functions are smooth and strongly convex. Our approach consists of approximately solving a sequence of sub-problems induced by the accelerated augmented Lagrangian method, thereby providing a systematic way for deriving several well-known decentralized algorithms including EXTRA arXiv:1404.6264 and SSDA arXiv:1702.08704. When coupled with accelerated gradient descent, our framework yields a novel primal algorithm whose convergence rate is optimal and matched by recently derived lower bounds. We provide experimental results that demonstrate the effectiveness of the proposed algorithm on highly ill-conditioned problems.
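
To illustrate the overall structure described above, here is a minimal toy sketch of an inexact augmented Lagrangian loop for decentralized consensus optimization: an outer loop that approximately solves each augmented Lagrangian subproblem and then performs a dual update on the consensus constraint. The quadratic local objectives, ring gossip matrix `W`, penalty `rho`, step sizes, and iteration counts are all illustrative assumptions, and both the outer-loop (Nesterov-style) acceleration and the accelerated inner solver used in the paper are omitted for brevity; this is not the authors' exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim = 5, 10

# Ill-conditioned local objectives f_i(x) = 0.5 * x^T A_i x - b_i^T x (toy data).
A = [np.diag(rng.uniform(1.0, 100.0, dim)) for _ in range(n_nodes)]
b = [rng.standard_normal(dim) for _ in range(n_nodes)]

# Symmetric, doubly stochastic gossip matrix for a ring network (an assumption).
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i + 1) % n_nodes] = 0.25
    W[i, (i - 1) % n_nodes] = 0.25
L_gossip = np.eye(n_nodes) - W  # Laplacian-like matrix; L_gossip @ X = 0 iff all rows agree

def grad_F(X):
    """Stacked local gradients; row i uses only node i's data."""
    return np.stack([A[i] @ X[i] - b[i] for i in range(n_nodes)])

def solve_subproblem(X0, Lam, rho, inner_steps=100, lr=5e-3):
    """Inexactly minimize F(X) + <Lam, X> + (rho/2) <X, (I - W) X> by plain
    gradient descent; each gradient evaluation needs one round of neighbor
    communication (the L_gossip @ X product). The paper would use an
    accelerated solver here instead."""
    X = X0.copy()
    for _ in range(inner_steps):
        G = grad_F(X) + Lam + rho * (L_gossip @ X)
        X = X - lr * G
    return X

X = np.zeros((n_nodes, dim))    # one local iterate per node (rows)
Lam = np.zeros((n_nodes, dim))  # dual variables enforcing consensus
rho = 10.0
for k in range(50):
    X = solve_subproblem(X, Lam, rho)   # inexact primal step on the subproblem
    Lam = Lam + rho * (L_gossip @ X)    # dual ascent on the consensus constraint

x_bar = X.mean(axis=0)
print("consensus violation:", np.linalg.norm(L_gossip @ X))
print("stationarity at average:",
      np.linalg.norm(sum(A[i] @ x_bar - b[i] for i in range(n_nodes))))
```

Running the sketch, the consensus violation and the gradient norm at the averaged iterate should both shrink as the outer iterations proceed; the paper's contribution is to accelerate both the outer loop and the inner solver so that the overall rate matches the known lower bounds.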
