Paper Title
Nonparametric Factor Trajectory Learning for Dynamic Tensor Decomposition
Paper Authors
Paper Abstract
Tensor decomposition is a fundamental framework for analyzing data that can be represented by multi-dimensional arrays. In practice, tensor data are often accompanied by temporal information, namely the time points at which the entry values were generated. This information implies rich, complex temporal variation patterns. However, current methods always assume that the factor representations of the entities in each tensor mode are static, and never consider their temporal evolution. To fill this gap, we propose NONparametric FActor Trajectory learning for dynamic tensor decomposition (NONFAT). We place Gaussian process (GP) priors in the frequency domain and sample the trajectory functions by performing an inverse Fourier transform via Gauss-Laguerre quadrature. In this way, we can overcome data sparsity and obtain robust trajectory estimates across long time horizons. Given the trajectory values at specific time points, we use a second-level GP to sample the entry values and to capture the temporal relationships between the entities. For efficient and scalable inference, we exploit the matrix Gaussian structure of the model, introduce a matrix Gaussian posterior, and develop a nested sparse variational learning algorithm. We demonstrate the advantages of our method in several real-world applications.
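The core idea of sampling a trajectory function from a frequency-domain GP can be illustrated with a minimal sketch. The snippet below is not the paper's implementation: the RBF kernel, the 20-node rule, and all variable names are illustrative assumptions. It places a GP over spectrum values at Gauss-Laguerre quadrature nodes on [0, ∞) and approximates the (real part of the) inverse Fourier integral as a weighted cosine sum, which yields a smooth trajectory u(t) from finitely many frequency samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Gauss-Laguerre rule: ∫_0^∞ g(ω) dω ≈ Σ_i w_i e^{ω_i} g(ω_i)
# (the e^{ω_i} factor cancels the e^{-ω} weight built into the rule).
nodes, weights = np.polynomial.laguerre.laggauss(20)

# Illustrative RBF kernel for the frequency-domain GP prior.
def rbf(a, b, ls=5.0):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

# Draw one GP sample of the spectrum at the quadrature nodes.
K = rbf(nodes, nodes) + 1e-6 * np.eye(len(nodes))
U = rng.multivariate_normal(np.zeros(len(nodes)), K)

def trajectory(t):
    # u(t) ≈ Σ_i w_i e^{ω_i} U(ω_i) cos(ω_i t): quadrature approximation
    # of the real part of the inverse Fourier transform of the spectrum.
    return float(np.sum(weights * np.exp(nodes) * U * np.cos(nodes * t)))

# Evaluating the same GP draw at any t gives a continuous trajectory,
# so sparse, irregular time stamps pose no problem.
values = [trajectory(t) for t in np.linspace(0.0, 3.0, 5)]
```

Because the quadrature nodes are fixed, only the GP values at those nodes need to be inferred; every time point then reuses the same finite sample, which is what makes long-horizon trajectory estimates tractable.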