Paper Title
Improving Sampling Accuracy of Stochastic Gradient MCMC Methods via Non-uniform Subsampling of Gradients
Paper Authors
Paper Abstract
Many Markov Chain Monte Carlo (MCMC) methods leverage gradient information of the potential function of the target distribution to explore the sample space efficiently. However, computing gradients can often be computationally expensive for large-scale applications, such as those in contemporary machine learning. Stochastic Gradient (SG-)MCMC methods approximate gradients by stochastic ones, commonly via uniformly subsampled data points, and achieve improved computational efficiency, albeit at the price of introducing sampling error. We propose a non-uniform subsampling scheme to improve the sampling accuracy. The proposed exponentially weighted stochastic gradient (EWSG) is designed so that a non-uniform-SG-MCMC method mimics the statistical behavior of a batch-gradient-MCMC method, and hence the inaccuracy due to the SG approximation is reduced. EWSG differs from classical variance reduction (VR) techniques as it focuses on the entire distribution instead of just the variance; nevertheless, its reduced local variance is also proved. EWSG can also be viewed as an extension of the importance sampling idea, which has been successful for stochastic-gradient-based optimization, to sampling tasks. In our practical implementation of EWSG, the non-uniform subsampling is performed efficiently via a Metropolis-Hastings chain on the data index, which is coupled to the MCMC algorithm. Numerical experiments are provided, not only to demonstrate EWSG's effectiveness, but also to guide hyperparameter choices and to validate our \emph{non-asymptotic global error bound} despite approximations in the implementation. Notably, while statistical accuracy is improved, convergence speed can be comparable to the uniform version, which renders EWSG a practical alternative to VR (but EWSG and VR can be combined too).
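To make the coupling mentioned in the abstract more concrete, below is a minimal illustrative sketch of an SGLD-style sampler whose data index is updated by a Metropolis-Hastings chain, in the spirit of non-uniform subsampling. This is not the authors' implementation: the `log_weight` function is a hypothetical placeholder standing in for the paper's exponential weights, and `grad_i` is an assumed user-supplied per-datum gradient.

```python
# Sketch only: non-uniform-subsampling SGLD with an index-level
# Metropolis-Hastings chain coupled to the parameter chain.
import numpy as np

def nonuniform_sgld(grad_i, log_weight, n_data, theta0, step_size, n_steps, rng=None):
    """grad_i(theta, i): gradient contribution of datum i (scaled to estimate the full gradient).
    log_weight(theta, i): unnormalized log-probability of selecting index i
                          (hypothetical placeholder, not the paper's EWSG weights)."""
    rng = rng or np.random.default_rng(0)
    theta = np.array(theta0, dtype=float)
    idx = rng.integers(n_data)  # current state of the index chain
    for _ in range(n_steps):
        # One Metropolis-Hastings move on the data index, coupled to theta.
        prop = rng.integers(n_data)
        if np.log(rng.uniform()) < log_weight(theta, prop) - log_weight(theta, idx):
            idx = prop
        # Langevin update driven by the non-uniformly subsampled gradient.
        g = grad_i(theta, idx)
        theta = theta - step_size * g + np.sqrt(2.0 * step_size) * rng.standard_normal(theta.shape)
    return theta
```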