Abstract:
Time-series forecasting is a critical task with extensive applications across diverse domains. However, effectively capturing both cross-dimension and cross-time dependencies in nonstationary time series remains a significant challenge, particularly due to the confounding effects of environmental factors. These factors often introduce spurious correlations that obscure the learning of meaningful temporal features. In this article, a novel framework, Caformer (Causal Transformer), is proposed for time-series forecasting grounded in causal reasoning, the science of identifying cause-and-effect relationships. The framework consists of four key modules: the dynamic learner, environment learner, temporal learner, and decompose learner. The dynamic learner uncovers dynamic interactions among features to model cross-dimension dependencies, while the temporal learner infers cross-time dependencies under causal constraints. The environment learner, together with the decompose learner, extracts environmental factors and applies a backdoor adjustment to mitigate their confounding effects on the time series. Extensive experiments demonstrate that Caformer achieves state-of-the-art performance in both long-term and short-term forecasting. It achieves up to a 26.2% mean squared error (MSE) reduction on the Traffic dataset and 21.8% on the Electricity dataset compared to PatchTST, with consistent gains across eight long-term benchmarks. On the M4 dataset, Caformer ranks first across all 15 short-term forecasting categories. In addition to strong predictive accuracy, Caformer provides interpretable insights into the learned dependencies.
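To make the described four-module layout concrete, the following is a minimal PyTorch sketch of how such a framework could be organized. All class names, layer sizes, the moving-average decomposition, and the additive way the environment vector is injected (a crude stand-in for the paper's backdoor adjustment) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a Caformer-style four-module forecaster.
import torch
import torch.nn as nn


class DecomposeLearner(nn.Module):
    """Splits the series into a smooth trend and a residual via moving
    average, a common decomposition choice in forecasting models (assumed)."""
    def __init__(self, kernel_size: int = 25):
        super().__init__()
        self.avg = nn.AvgPool1d(kernel_size, stride=1,
                                padding=kernel_size // 2,
                                count_include_pad=False)

    def forward(self, x):                      # x: (batch, seq_len, n_vars)
        trend = self.avg(x.transpose(1, 2)).transpose(1, 2)
        return x - trend, trend                # (residual, trend)


class EnvironmentLearner(nn.Module):
    """Summarizes the series into a global 'environment' vector that
    serves as the estimated confounder (assumed design)."""
    def __init__(self, n_vars: int, d_model: int):
        super().__init__()
        self.proj = nn.Linear(n_vars, d_model)

    def forward(self, x):                      # x: (batch, seq_len, n_vars)
        return self.proj(x.mean(dim=1))        # (batch, d_model)


class DynamicLearner(nn.Module):
    """Cross-dimension dependencies: attention across the variable axis."""
    def __init__(self, seq_len: int, d_model: int):
        super().__init__()
        self.embed = nn.Linear(seq_len, d_model)
        self.attn = nn.MultiheadAttention(d_model, num_heads=4,
                                          batch_first=True)

    def forward(self, x):                      # x: (batch, seq_len, n_vars)
        v = self.embed(x.transpose(1, 2))      # (batch, n_vars, d_model)
        out, _ = self.attn(v, v, v)
        return out


class TemporalLearner(nn.Module):
    """Cross-time dependencies under a causal mask: each time step may
    attend only to earlier steps."""
    def __init__(self, n_vars: int, d_model: int):
        super().__init__()
        self.embed = nn.Linear(n_vars, d_model)
        self.attn = nn.MultiheadAttention(d_model, num_heads=4,
                                          batch_first=True)

    def forward(self, x):                      # x: (batch, seq_len, n_vars)
        t = self.embed(x)                      # (batch, seq_len, d_model)
        L = t.size(1)
        mask = torch.triu(torch.ones(L, L, dtype=torch.bool,
                                     device=t.device), diagonal=1)
        out, _ = self.attn(t, t, t, attn_mask=mask)
        return out


class Caformer(nn.Module):
    """Combines the four learners; the environment vector is subtracted
    as an additive adjustment, a simplified placeholder for the paper's
    backdoor adjustment."""
    def __init__(self, seq_len, pred_len, n_vars, d_model=64):
        super().__init__()
        self.decompose = DecomposeLearner()
        self.environment = EnvironmentLearner(n_vars, d_model)
        self.dynamic = DynamicLearner(seq_len, d_model)
        self.temporal = TemporalLearner(n_vars, d_model)
        self.head = nn.Linear(d_model, pred_len)

    def forward(self, x):                      # x: (batch, seq_len, n_vars)
        residual, trend = self.decompose(x)
        env = self.environment(trend)          # confounder estimate
        dyn = self.dynamic(residual)           # (batch, n_vars, d_model)
        tmp = self.temporal(residual)          # (batch, seq_len, d_model)
        # Fuse per-variable and temporal features; adjust by environment.
        fused = dyn + tmp.mean(dim=1, keepdim=True) - env.unsqueeze(1)
        return self.head(fused).transpose(1, 2)  # (batch, pred_len, n_vars)


x = torch.randn(8, 96, 7)                      # e.g. a 7-variable series
model = Caformer(seq_len=96, pred_len=24, n_vars=7)
print(model(x).shape)                          # torch.Size([8, 24, 7])
```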