Paper Title
On the Variational Posterior of Dirichlet Process Deep Latent Gaussian Mixture Models
Paper Authors
Abstract
Thanks to the reparameterization trick, deep latent Gaussian models have recently shown tremendous success in learning latent representations. The ability to couple them with nonparametric priors such as the Dirichlet Process (DP), however, has not seen similar success due to the DP's non-parameterizable nature. In this paper, we present an alternative treatment of the variational posterior of the Dirichlet Process Deep Latent Gaussian Mixture Model (DP-DLGMM), in which we show that the prior cluster parameters and the variational posteriors of the beta distributions and cluster hidden variables can be updated in closed form. This leads to a standard reparameterization trick on the Gaussian latent variables given the cluster assignments. We demonstrate our approach on standard benchmark datasets, showing that our model is capable of generating realistic samples for each cluster obtained and exhibits competitive performance in a semi-supervised setting.
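The abstract's key mechanism is that, once the cluster assignment is known, the Gaussian latent variables admit the standard reparameterization trick. A minimal sketch of this per-cluster reparameterization is below; the encoder outputs (per-cluster means and log-variances) are hypothetical toy values, not the paper's actual inference network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-cluster Gaussian posterior parameters for a single input x.
# In the DP-DLGMM these would come from an inference network; here they are
# fixed toy values for illustration only.
K, D = 3, 2                        # number of clusters, latent dimension
mu = rng.normal(size=(K, D))       # mu_k(x): per-cluster posterior means
log_var = rng.normal(size=(K, D))  # log sigma_k(x)^2: per-cluster log-variances

def reparameterize(k):
    """Standard reparameterization trick, conditioned on cluster assignment k:
    z = mu_k + sigma_k * eps, with eps ~ N(0, I)."""
    eps = rng.standard_normal(D)
    return mu[k] + np.exp(0.5 * log_var[k]) * eps

z = reparameterize(k=1)
print(z.shape)  # (2,)
```

Conditioning on the assignment k is what restores differentiability: the sampled z is a deterministic, differentiable function of mu_k and log_var_k, so gradients flow through them while the non-parameterizable DP components are handled by the closed-form updates the abstract describes.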