Title


Approximate Message Passing for Multi-Layer Estimation in Rotationally Invariant Models

Authors

Yizhou Xu, TianQi Hou, ShanSuo Liang, Marco Mondelli

Abstract


We consider the problem of reconstructing the signal and the hidden variables from observations coming from a multi-layer network with rotationally invariant weight matrices. The multi-layer structure models inference from deep generative priors, and the rotational invariance imposed on the weights generalizes the i.i.d. Gaussian assumption by allowing for a complex correlation structure, which is typical in applications. In this work, we present a new class of approximate message passing (AMP) algorithms and give a state evolution recursion which precisely characterizes their performance in the large system limit. In contrast with the existing multi-layer VAMP (ML-VAMP) approach, our proposed AMP -- dubbed multi-layer rotationally invariant generalized AMP (ML-RI-GAMP) -- provides a natural generalization beyond Gaussian designs, in the sense that it recovers the existing Gaussian AMP as a special case. Furthermore, ML-RI-GAMP exhibits a significantly lower complexity than ML-VAMP, as the computationally intensive singular value decomposition is replaced by an estimation of the moments of the design matrices. Finally, our numerical results show that this complexity gain comes at little to no cost in the performance of the algorithm.
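The complexity gain mentioned in the abstract comes from replacing a full SVD of each design matrix with estimates of its spectral moments. A minimal sketch of this idea (not the paper's ML-RI-GAMP algorithm itself, and with an arbitrary Gaussian matrix standing in for a design matrix): the moments of W^T W can be estimated from traces of matrix powers, which requires only matrix multiplications, and the result matches the moments one would obtain from the singular values.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 200
W = rng.standard_normal((n, n)) / np.sqrt(n)  # stand-in design matrix

def spectral_moments(W, k_max):
    """Estimate m_k = (1/n) tr((W^T W)^k) for k = 1..k_max,
    using only matrix products (no SVD)."""
    S = W.T @ W
    P = np.eye(S.shape[0])
    moments = []
    for _ in range(k_max):
        P = P @ S
        moments.append(np.trace(P) / S.shape[0])
    return moments

moments = spectral_moments(W, 3)

# Reference values via the SVD, which the moment-based route avoids:
# (1/n) tr((W^T W)^k) equals the empirical mean of sigma_i^(2k).
svals = np.linalg.svd(W, compute_uv=False)
svd_moments = [np.mean(svals ** (2 * k)) for k in range(1, 4)]
```

In a multi-layer setting, such moment estimates would be computed once per layer's design matrix, whereas ML-VAMP requires the full singular value decomposition of each.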
