Paper Title
FECAM: Frequency Enhanced Channel Attention Mechanism for Time Series Forecasting
Paper Authors
Paper Abstract
Time series forecasting is a long-standing challenge because real-world information arises in diverse scenarios (e.g., energy, weather, traffic, economics, earthquake warning). However, the predictions of some mainstream forecasting models deviate dramatically from the ground truth. We believe this is because these models lack the ability to capture frequency information, which is abundant in real-world datasets. At present, mainstream frequency-information extraction methods are based on the Fourier transform (FT). However, using the FT is problematic because of the Gibbs phenomenon: if the values at the two ends of a sequence differ significantly, oscillatory approximations appear near the boundaries and high-frequency noise is introduced. We therefore propose a novel frequency enhanced channel attention that adaptively models frequency interdependencies between channels based on the Discrete Cosine Transform (DCT), which intrinsically avoids the high-frequency noise caused by the problematic periodicity assumed during the Fourier transform, i.e., the Gibbs phenomenon. We show that this network generalizes extremely effectively across six real-world datasets and achieves state-of-the-art performance. We further demonstrate that the frequency enhanced channel attention mechanism module can be flexibly applied to different networks: it improves the prediction ability of existing mainstream networks, reducing MSE by 35.99% on LSTM, 10.01% on Reformer, 8.71% on Informer, 8.29% on Autoformer, 8.06% on Transformer, etc., at a slight computational cost and with just a few lines of code. Our code and data are available at https://github.com/Zero-coder/FECAM.
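The core idea described in the abstract can be illustrated with a minimal sketch: compute a per-channel DCT of the input series, squeeze the frequency coefficients into one descriptor per channel, and use it to re-weight the channels. This is a simplified, hypothetical illustration, not the authors' exact implementation (the function name `frequency_channel_attention` and the softmax standing in for a learned excitation network are our assumptions; see the linked repository for the real module).

```python
import numpy as np
from scipy.fft import dct


def frequency_channel_attention(x):
    """Hypothetical sketch of DCT-based channel attention.

    x: array of shape (batch, length, channels).
    Returns an array of the same shape with channels re-weighted
    by frequency-derived attention scores.
    """
    # Per-channel DCT-II along the time axis. The DCT implicitly
    # assumes an even-symmetric extension of the signal, so it does
    # not introduce the boundary oscillations (Gibbs phenomenon)
    # that the Fourier transform's periodic extension causes when
    # the two ends of the sequence differ significantly.
    freq = dct(x, type=2, axis=1, norm='ortho')        # (batch, length, channels)

    # Squeeze the frequency axis into one descriptor per channel.
    desc = np.abs(freq).mean(axis=1)                   # (batch, channels)

    # A softmax stands in here for the learned excitation layers
    # of the actual module (an assumption for this sketch).
    e = np.exp(desc - desc.max(axis=-1, keepdims=True))
    weights = e / e.sum(axis=-1, keepdims=True)        # (batch, channels)

    # Re-weight channels, squeeze-and-excitation style.
    return x * weights[:, None, :]
```

Because the module only rescales channels, it preserves the input shape and can be dropped between layers of an existing network with a few lines of code, which matches the plug-in usage the abstract describes.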