Paper Title

Sparse within Sparse Gaussian Processes using Neighbor Information

Paper Authors

Gia-Lac Tran, Dimitrios Milios, Pietro Michiardi, Maurizio Filippone

Paper Abstract

Approximations to Gaussian processes based on inducing variables, combined with variational inference techniques, enable state-of-the-art sparse approaches to infer GPs at scale through mini-batch-based learning. In this work, we address one limitation of sparse GPs, which is due to the challenge of dealing with a large number of inducing variables without imposing a special structure on the inducing inputs. In particular, we introduce a novel hierarchical prior, which imposes sparsity on the set of inducing variables. We treat our model variationally, and we experimentally show considerable computational gains over standard sparse GPs when sparsity on the inducing variables is realized by considering the nearest inducing inputs of a random mini-batch of the data. We perform an extensive experimental validation that demonstrates the effectiveness of our approach compared to the state of the art. Our approach makes it possible to use sparse GPs with a large number of inducing points without incurring a prohibitive computational cost.
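The key idea in the abstract, that only the inducing inputs nearest to a random mini-batch need to be active in each update, can be sketched as a simple neighbor-selection step. The following is a minimal illustrative sketch, not the paper's actual algorithm: the function name, the choice of Euclidean distance, and the value of k are assumptions for demonstration purposes.

```python
import numpy as np

def select_nearest_inducing(Z, X_batch, k=5):
    """Illustrative sketch (not the paper's exact procedure): for each point
    in a random mini-batch X_batch, find its k nearest inducing inputs among
    the rows of Z, and return the union as the active inducing subset."""
    # Pairwise squared Euclidean distances, shape (batch_size, M)
    d2 = ((X_batch[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    # Indices of the k nearest inducing inputs for each batch point
    nearest = np.argsort(d2, axis=1)[:, :k]
    # Union over the mini-batch: the only inducing points touched this step
    active = np.unique(nearest)
    return active, Z[active]

# Example: 1000 inducing points, but a mini-batch of 32 points in 2-D
# activates at most 32 * 5 = 160 of them, so the per-step GP computations
# involve a much smaller inducing set.
rng = np.random.default_rng(0)
Z = rng.normal(size=(1000, 2))
X_batch = rng.normal(size=(32, 2))
idx, Z_active = select_nearest_inducing(Z, X_batch, k=5)
print(len(idx), Z_active.shape)
```

Because the active set is bounded by (batch size) x k rather than the total number of inducing points M, the per-iteration cost no longer grows cubically in M, which is consistent with the computational gains the abstract reports.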
