Paper Title


Optimal Gradient Sliding and its Application to Distributed Optimization Under Similarity

Authors

Dmitry Kovalev, Aleksandr Beznosikov, Ekaterina Borodich, Alexander Gasnikov, Gesualdo Scutari

Abstract


We study structured convex optimization problems with additive objective $r := p + q$, where $r$ is ($\mu$-strongly) convex, $q$ is $L_q$-smooth and convex, and $p$ is $L_p$-smooth, possibly nonconvex. For this class of problems, we propose an inexact accelerated gradient sliding method that can skip the gradient computation for one of these components while still achieving the optimal complexity of gradient calls of $p$ and $q$, that is, $\mathcal{O}(\sqrt{L_p/\mu})$ and $\mathcal{O}(\sqrt{L_q/\mu})$, respectively. This result is much sharper than the classic black-box complexity $\mathcal{O}(\sqrt{(L_p+L_q)/\mu})$, especially when the gap between $L_p$ and $L_q$ is large. We then apply the proposed method to solve distributed optimization problems over master-worker architectures under agents' function similarity, due to statistical data similarity or otherwise. The distributed algorithm achieves for the first time lower complexity bounds on {\it both} communication and local gradient calls, with the former being a long-standing open problem. Finally, the method is extended to distributed saddle-point problems (under function similarity) by means of solving a class of variational inequalities, achieving lower communication and computation complexity bounds.
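The gain over the black-box rate is easiest to see numerically. The sketch below plugs illustrative (made-up, not from the paper) constants $L_p \gg L_q$ into the stated bounds, showing that gradient sliding decouples the two gradient-call counts, so the cheap component $q$ needs far fewer gradient evaluations than the coupled black-box bound would require of it:

```python
import math

# Illustrative constants (assumptions for this sketch, not values from the paper):
# p is the component with large smoothness constant, q the one with small.
L_p, L_q, mu = 1e6, 1.0, 1e-2

# Classic black-box accelerated bound: O(sqrt((L_p + L_q)/mu)) gradient
# calls of the full objective r = p + q, i.e. of BOTH components.
black_box = math.sqrt((L_p + L_q) / mu)

# Gradient-sliding bounds: the calls to grad p and grad q are decoupled.
calls_p = math.sqrt(L_p / mu)  # O(sqrt(L_p/mu)) gradients of p
calls_q = math.sqrt(L_q / mu)  # O(sqrt(L_q/mu)) gradients of q

print(f"black-box: ~{black_box:.0f} gradients of both p and q")
print(f"sliding:   ~{calls_p:.0f} gradients of p, only ~{calls_q:.0f} of q")
```

With these constants the black-box bound charges roughly $10^4$ gradient calls to *each* component, while sliding still pays $\sim 10^4$ calls for $p$ but only $\sim 10$ for $q$; this is exactly the regime exploited in the distributed setting, where gradients of one component require communication.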
