Paper Title
LORD: Lower-Dimensional Embedding of Log-Signature in Neural Rough Differential Equations
Paper Authors
Paper Abstract
The problem of processing very long time-series data (e.g., sequences longer than 10,000 steps) is a long-standing research problem in machine learning. Recently, a breakthrough called neural rough differential equations (NRDEs) was proposed and shown to be capable of handling such data. Its main idea is to use the log-signature transform, which is known to be more efficient than the Fourier transform for irregular long time series, to convert a very long time-series sample into a much shorter sequence of feature vectors. However, the log-signature transform incurs a non-trivial space overhead. To address this, we present the method of LOweR-Dimensional embedding of log-signature (LORD), in which we define an NRDE-based autoencoder that implants higher-depth log-signature knowledge into a lower-depth log-signature. We show that the encoder successfully combines the higher-depth and lower-depth log-signature knowledge, which greatly stabilizes the training process and increases model accuracy. In our experiments on benchmark datasets, the improvement ratio achieved by our method is up to 75\% in terms of various classification and forecasting evaluation metrics.
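To make the shortening step concrete, below is a minimal, self-contained sketch (not the authors' implementation, and independent of any signature library) of a windowed depth-2 log-signature transform: each window of the raw series is summarized by its overall increment plus its Lévy areas, so a series of length L collapses to roughly L / window_len feature vectors of dimension d + d(d-1)/2. The window length and the choice of depth 2 here are illustrative assumptions.

```python
import numpy as np

def depth2_logsig(window: np.ndarray) -> np.ndarray:
    """Depth-2 log-signature of a piecewise-linear path.

    window: array of shape (n_points, d).
    Returns a vector of length d + d*(d-1)//2: the total increment
    followed by the upper-triangular Levy areas.
    """
    x = window - window[0]            # shift so the path starts at the origin
    dx = np.diff(x, axis=0)           # segment increments, shape (n_points - 1, d)
    increment = x[-1]                 # level-1 term: overall displacement
    # Levy area A_ij = 1/2 * sum_k (x_k,i * dx_k,j - x_k,j * dx_k,i)
    area = 0.5 * (x[:-1].T @ dx - dx.T @ x[:-1])
    iu = np.triu_indices(window.shape[1], k=1)
    return np.concatenate([increment, area[iu]])

def logsig_windows(series: np.ndarray, window_len: int) -> np.ndarray:
    """Split a long series of shape (length, d) into windows and
    log-sign each one, yielding a much shorter sequence of features."""
    n_windows = series.shape[0] // window_len
    feats = [depth2_logsig(series[i * window_len:(i + 1) * window_len + 1])
             for i in range(n_windows)]
    return np.stack(feats)

# Example: a 12,000-step, 3-channel series becomes 120 vectors of dimension 6.
long_series = np.random.randn(12_000, 3).cumsum(axis=0)
features = logsig_windows(long_series, window_len=100)
print(features.shape)  # (120, 6)
```

The resulting short sequence of feature vectors is what an NRDE consumes; increasing the truncation depth adds higher-order iterated-integral terms and enlarges the feature dimension rapidly, which is the space overhead that LORD targets.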