Paper Title
Unnormalized Variational Bayes
Paper Authors
Paper Abstract
We unify empirical Bayes and variational Bayes for approximating unnormalized densities. This framework, named unnormalized variational Bayes (UVB), is based on formulating a latent variable model for the random variable $Y = X + N(0, \sigma^2 I_d)$ and using the evidence lower bound (ELBO), computed by a variational autoencoder, as a parametrization of the energy function of $Y$, which is then used to estimate $X$ with the empirical Bayes least-squares estimator. In this intriguing setup, the $\textit{gradient}$ of the ELBO with respect to noisy inputs plays the central role in learning the energy function. Empirically, we demonstrate that UVB has a higher capacity to approximate energy functions than the MLP parametrization used in neural empirical Bayes (DEEN). We especially showcase $\sigma = 1$, where the differences between UVB and DEEN become visible and qualitative in the denoising experiments. At this high level of noise, the distribution of $Y$ is heavily smoothed, and we demonstrate that one can traverse all MNIST classes in a variety of styles in a single run, without a restart, via walk-jump sampling with a fast-mixing Langevin MCMC sampler. We finish by probing the encoder/decoder of the trained models and confirm UVB $\neq$ VAE.
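For context, the empirical Bayes least-squares estimator referenced in the abstract has a closed form via a classical identity (due to Miyasawa): for $Y = X + N(0, \sigma^2 I_d)$, the posterior mean of $X$ given $Y = y$ depends only on the score of $Y$'s density. Writing the energy as $E(y) = -\log p(y)$ up to an additive constant, the estimator reads

$$\hat{x}(y) = \mathbb{E}[X \mid Y = y] = y + \sigma^2 \nabla_y \log p(y) = y - \sigma^2 \nabla_y E(y),$$

which is why the gradient of the energy with respect to the noisy input, in UVB parametrized via the (negative) ELBO, is the central quantity in this framework.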
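The walk-jump sampling procedure mentioned in the abstract alternates Langevin MCMC steps on the smoothed density of $Y$ (the "walk") with empirical Bayes estimates of $X$ from the current $Y$ (the "jump"). Below is a minimal sketch, not the paper's implementation: the toy Gaussian energy, `grad_energy`, the step size `delta`, and `n_steps` are all illustrative assumptions. In UVB, the gradient would instead come from differentiating the ELBO of the trained variational autoencoder with respect to the noisy input.

```python
# Minimal walk-jump sampling sketch with a toy Gaussian energy.
import numpy as np

sigma = 1.0      # noise level in Y = X + N(0, sigma^2 I_d)
delta = 0.05     # Langevin step size (hypothetical value)
n_steps = 1000   # number of walk steps (hypothetical value)
d = 2            # dimensionality of the toy example

def grad_energy(y):
    # Toy stand-in: for X ~ N(0, I_d), Y ~ N(0, (1 + sigma^2) I_d), so
    # E(y) = ||y||^2 / (2 (1 + sigma^2)) and its gradient is y / (1 + sigma^2).
    # In UVB this gradient would be computed by backpropagating the ELBO
    # through the VAE with respect to the noisy input y.
    return y / (1.0 + sigma**2)

rng = np.random.default_rng(0)
y = rng.normal(size=d)  # initialize the walk in Y-space
samples_x = []

for _ in range(n_steps):
    # Walk: one unadjusted Langevin step on the smoothed density of Y.
    y = y - delta * grad_energy(y) + np.sqrt(2 * delta) * rng.normal(size=d)
    # Jump: empirical Bayes least-squares estimate of X from the current y.
    x_hat = y - sigma**2 * grad_energy(y)
    samples_x.append(x_hat)

samples_x = np.asarray(samples_x)
print(samples_x.mean(axis=0), samples_x.std(axis=0))
```

Because the chain never restarts, a fast-mixing walk in the heavily smoothed $Y$-space can visit many modes (e.g. all MNIST classes in the paper's $\sigma = 1$ experiments) while the jump step maps each visited $y$ back to a clean estimate of $X$.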