Abstract: To address the challenges of multi-dimensional feature modeling, non-stationary data, and high accuracy requirements in current time series forecasting tasks, a non-stationary-learning inverted Transformer model combined with causal convolution is proposed. The model first applies an inverted embedding that exchanges the roles of the attention mechanism and the feed-forward network on time series data: the attention mechanism learns the correlations among multiple variates, while the feed-forward network learns the temporal dependencies of each series. Modeling both the time and variate dimensions of multi-dimensional time series in this way enhances the model's generalization over the temporal dimension and the inter-variate relationships, and thereby improves its interpretability. A series stationarization module is then used to mitigate the non-stationarity of the data and improve the model's predictability. Finally, a non-stationary-learning attention mechanism combined with causal convolution reintroduces the key features and information lost in the stationarization module, thereby enhancing prediction accuracy. Compared with multiple mainstream benchmark models, including PatchTST, iTransformer, and Crossformer, the mean squared error of the proposed model on four datasets, including Exchange, decreases by 6.2% to 65.0%. Ablation experiments show that the inverted embedding module and the non-stationary-learning attention module combined with causal convolution each effectively improve the accuracy of time series forecasting.
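The inverted-embedding idea summarized above can be sketched as follows: each variate's entire series is embedded as one token, and self-attention then operates across variate tokens to capture multivariate correlations. This is a minimal illustrative sketch, not the paper's implementation; the function names, shapes, and the omission of learned query/key/value projections are simplifying assumptions.

```python
import numpy as np

def inverted_embedding(x, w_embed):
    """Embed each variate's whole series as one token.
    x: (T, N) series with T time steps and N variates (assumed shapes).
    w_embed: (T, D) projection from series length T to model dim D.
    Returns tokens of shape (N, D): one token per variate."""
    return x.T @ w_embed  # (N, T) @ (T, D) -> (N, D)

def attention_over_variates(tokens):
    """Self-attention across variate tokens, modeling correlations
    between variates. Projections are omitted for brevity; this is
    plain scaled dot-product attention over the N tokens."""
    d = tokens.shape[1]
    scores = tokens @ tokens.T / np.sqrt(d)            # (N, N)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)      # softmax rows
    return weights @ tokens                            # (N, D)

# Example: 96 time steps, 7 variates, model dimension 64.
rng = np.random.default_rng(0)
x = rng.normal(size=(96, 7))
w = rng.normal(size=(96, 64)) * 0.1
tokens = inverted_embedding(x, w)       # (7, 64): 7 variate tokens
mixed = attention_over_variates(tokens) # (7, 64): correlation-aware tokens
```

In a standard Transformer the tokens would be time steps; inverting the embedding makes the attention matrix an N-by-N map over variates, which is what lets the feed-forward network (applied per token) handle the temporal dependence instead.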