Title
Variational approximations of empirical Bayes posteriors in high-dimensional linear models
Authors
Abstract
In high dimensions, the prior tails can have a significant effect on both posterior computation and asymptotic concentration rates. To achieve optimal rates while keeping the posterior computations relatively simple, an empirical Bayes approach has recently been proposed, featuring thin-tailed conjugate priors with data-driven centers. While conjugate priors ease some of the computational burden, Markov chain Monte Carlo methods are still needed, which can be expensive when the dimension is high. In this paper, we develop a variational approximation to the empirical Bayes posterior that is fast to compute and retains the optimal concentration rate properties of the original. In simulations, our method is shown to have superior performance compared to existing variational approximations in the literature across a wide range of high-dimensional settings.
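The abstract does not spell out the authors' algorithm, but the general idea of a mean-field variational approximation in a linear model can be illustrated with a minimal sketch. The following is not the paper's empirical Bayes procedure; it is a generic coordinate-ascent variational inference (CAVI) routine for Gaussian linear regression with a conjugate Gaussian prior N(0, tau2·I) and known noise variance sigma2 (both assumed here for illustration), the kind of cheap iterative update that replaces MCMC sampling:

```python
import numpy as np

def cavi_linear(X, y, sigma2=1.0, tau2=1.0, n_iter=100):
    """Mean-field CAVI for Gaussian linear regression (illustrative sketch).

    Model assumptions (not from the paper): y = X beta + noise,
    noise variance sigma2 known, prior beta_j ~ N(0, tau2) independently.
    Each q(beta_j) is Gaussian with mean m[j] and variance s2[j].
    """
    n, p = X.shape
    m = np.zeros(p)                              # variational means
    col_ss = (X ** 2).sum(axis=0)                # per-column sums of squares
    s2 = 1.0 / (col_ss / sigma2 + 1.0 / tau2)    # variational variances (fixed)
    r = y - X @ m                                # residual under current means
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * m[j]                  # remove coordinate j's fit
            m[j] = s2[j] * (X[:, j] @ r) / sigma2
            r -= X[:, j] * m[j]                  # restore with updated mean
    return m, s2
```

For a Gaussian target the mean-field fixed point recovers the exact posterior mean (though it understates posterior variances), so the sketch can be sanity-checked against the closed-form ridge-type solution; the per-sweep cost is O(np), which is what makes variational schemes attractive when MCMC is too expensive.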