Time Series Prediction Model Combining Non-stationary Learning Inverted Transformer with Causal Convolution
Fund Project: Shanxi Basic Research Program Joint Fund Project (Taiyuan Heavy Industry) (TZLH20230818007)




    Abstract:

    To address the difficulties of multivariate feature modeling, non-stationary data, and demanding accuracy requirements in current time series forecasting tasks, a non-stationary learning inverted Transformer model combined with causal convolution is proposed. The model first applies inverted embedding to the time series, exchanging the original roles of the attention mechanism and the feed-forward network: the attention mechanism learns the multivariate correlations of the series, while the feed-forward network learns its temporal dependencies. Modeling multivariate series along both the temporal and variate dimensions strengthens the model's generalization across time and across inter-variable relationships, and thus improves its interpretability. Next, a series stationarization module mitigates the non-stationarity of the data to improve predictability. Finally, a non-stationary learning attention mechanism combined with causal convolution reintroduces the key features and information removed by the stationarization module, thereby improving prediction accuracy. Compared with mainstream baseline models including PatchTST, iTransformer, and Crossformer, the proposed model reduces mean squared error by 6.2%–65.0% on average across four datasets including Exchange. Ablation experiments show that the inverted embedding module and the causal-convolution-based non-stationary learning attention module each effectively improve the accuracy of time series forecasting.
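As a rough illustration of the three components the abstract describes — series stationarization, inverted (variate-wise) embedding, and causal convolution — the following NumPy sketch shows a minimal version of each. All function names, shapes, and filter values here are illustrative assumptions, not the authors' implementation; the paper's attention mechanism and learned projections are omitted.

```python
import numpy as np

def stationarize(x):
    # Per-variate instance normalization over time (series stationarization):
    # remove each variate's mean and scale so the model sees a stationary input;
    # the removed statistics are returned so they can be reintroduced later.
    mu = x.mean(axis=0, keepdims=True)            # (1, N) removed means
    sigma = x.std(axis=0, keepdims=True) + 1e-5   # (1, N) removed scales
    return (x - mu) / sigma, mu, sigma

def causal_conv1d(x, w):
    # Causal 1-D convolution along time: the output at step t depends only on
    # steps <= t, enforced by left-padding with k-1 zeros.
    k = len(w)
    pad = np.zeros((k - 1, x.shape[1]))
    xp = np.concatenate([pad, x], axis=0)          # (T+k-1, N)
    return np.stack([(xp[t:t + k] * w[:, None]).sum(axis=0)
                     for t in range(x.shape[0])])  # (T, N)

def inverted_embed(x, W):
    # Inverted embedding: each variate's whole length-T series becomes one
    # token, so attention mixes variates while a feed-forward net models time.
    return x.T @ W                                 # (N, d) variate tokens

T, N, d = 24, 3, 8
rng = np.random.default_rng(0)
x = rng.normal(size=(T, N)) + np.arange(T)[:, None]   # drifting, non-stationary
xs, mu, sigma = stationarize(x)                        # stationarized input
tokens = inverted_embed(xs, rng.normal(size=(T, d)))   # one token per variate
feat = causal_conv1d(xs, np.array([0.5, 0.3, 0.2]))    # causal temporal features
```

The de-stationary step of the actual model would use `mu` and `sigma` to rescale attention inside the network; here they only suffice to invert the normalization exactly.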

Cite this article:

李子烨, 乔钢柱. Time Series Prediction Model Combining Non-stationary Learning Inverted Transformer with Causal Convolution. 计算机系统应用 (Computer Systems & Applications), , (): 1-11

History
  • Received: 2024-09-20
  • Last revised: 2024-10-21
  • Published online: 2025-02-25
Copyright: Institute of Software, Chinese Academy of Sciences