Paper Title

Self-Adaptive Forecasting for Improved Deep Learning on Non-Stationary Time-Series

Paper Authors

Sercan O. Arik, Nathanael C. Yoder, Tomas Pfister

Paper Abstract

Real-world time-series datasets often violate the assumptions of standard supervised learning for forecasting -- their distributions evolve over time, rendering the conventional training and model selection procedures suboptimal. In this paper, we propose a novel method, Self-Adaptive Forecasting (SAF), to modify the training of time-series forecasting models to improve their performance on forecasting tasks with such non-stationary time-series data. SAF integrates a self-adaptation stage prior to forecasting based on `backcasting', i.e. predicting masked inputs backward in time. This is a form of test-time training that creates a self-supervised learning problem on test samples before performing the prediction task. In this way, our method enables efficient adaptation of encoded representations to evolving distributions, leading to superior generalization. SAF can be integrated with any canonical encoder-decoder based time-series architecture such as recurrent neural networks or attention-based architectures. On synthetic and real-world datasets in domains where time-series data are known to be notoriously non-stationary, such as healthcare and finance, we demonstrate a significant benefit of SAF in improving forecasting accuracy.
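
The abstract describes SAF's core mechanism: before producing a forecast, the model first solves a self-supervised backcasting problem on the test window itself and takes an adaptation step on its encoded representation. Below is a minimal illustrative sketch of this idea in PyTorch; the `EncoderDecoder` module, the masking scheme, and the single-step SGD adaptation are hypothetical simplifications for illustration, not the paper's exact architecture or procedure.

```python
# Minimal sketch of SAF-style test-time self-adaptation (hypothetical names and
# architecture; the paper's exact masking and update scheme may differ).
# Idea: on each test window, mask the earliest steps, adapt the encoder by
# "backcasting" them from the visible later steps, then forecast as usual.
import copy
import torch
import torch.nn as nn

class EncoderDecoder(nn.Module):
    def __init__(self, hidden_dim=32, mask_len=8, horizon=24):
        super().__init__()
        self.mask_len = mask_len
        self.encoder = nn.GRU(1, hidden_dim, batch_first=True)
        self.backcaster = nn.Linear(hidden_dim, mask_len)  # reconstructs the masked early steps
        self.forecaster = nn.Linear(hidden_dim, horizon)   # predicts the future horizon

    def encode(self, x):               # x: (batch, time, 1)
        _, h = self.encoder(x)
        return h[-1]                   # last-layer hidden state: (batch, hidden_dim)

def saf_predict(model, x, adapt_steps=1, lr=1e-3):
    """Adapt a copy of the model on a backcasting loss over the test window, then forecast."""
    adapted = copy.deepcopy(model)     # the trained weights stay untouched across test samples
    opt = torch.optim.SGD(adapted.parameters(), lr=lr)

    masked, visible = x[:, :model.mask_len, 0], x[:, model.mask_len:]
    for _ in range(adapt_steps):
        opt.zero_grad()
        backcast = adapted.backcaster(adapted.encode(visible))   # predict masked steps backward in time
        loss = nn.functional.mse_loss(backcast, masked)          # self-supervised loss on the test sample
        loss.backward()
        opt.step()

    with torch.no_grad():
        return adapted.forecaster(adapted.encode(x))   # forecast with the adapted representation

# Usage: a batch of 4 univariate series, each with 96 past steps.
model = EncoderDecoder()
x = torch.randn(4, 96, 1)
print(saf_predict(model, x).shape)     # torch.Size([4, 24])
```

Copying the model before adaptation keeps the trained weights fixed across test samples, so each window is adapted independently; this mirrors the test-time-training framing in the abstract, though the actual SAF update and backcasting decoder may be structured differently.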
