
Released

Journal Article

Caformer: Rethinking Time-Series Forecasting From Causal Perspective

Authors

Zhang,  Kexuan
External Organizations;

Zou,  Xiaobei
External Organizations;

Yen,  Gary G.
External Organizations;

Tang,  Yang
External Organizations;


Kurths,  Jürgen
Potsdam Institute for Climate Impact Research;

Citation

Zhang, K., Zou, X., Yen, G. G., Tang, Y., Kurths, J. (2025 online): Caformer: Rethinking Time-Series Forecasting From Causal Perspective. - IEEE Transactions on Cybernetics.
https://doi.org/10.1109/TCYB.2025.3631588


Cite as: https://publications.pik-potsdam.de/pubman/item/item_33632
Abstract
Time-series forecasting is considered a critical task with extensive applications across diverse domains. However, effectively capturing both cross-dimension and cross-time dependencies in nonstationary time series remains a significant challenge, particularly due to the confounding effects of environmental factors. These factors often introduce spurious correlations that obscure the learning of meaningful temporal features. In this article, a novel framework, Caformer (Causal Transformer), is proposed for time-series forecasting grounded in causal reasoning, the science of identifying causality. The framework consists of four key modules: the dynamic learner, environment learner, temporal learner, and decompose learner. The dynamic learner uncovers dynamic interactions among features to model cross-dimension dependencies, while the temporal learner infers cross-time dependencies under causal constraints. The environment learner, together with the decompose learner, extracts environmental factors and applies a backdoor adjustment to mitigate their confounding effects on the time series. Extensive experiments demonstrate that Caformer achieves state-of-the-art performance in both long-term and short-term forecasting. It achieves up to a 26.2% mean squared error (MSE) reduction on the Traffic dataset and 21.8% on the Electricity dataset compared to PatchTST, with consistent gains across eight long-term benchmarks. On the M4 dataset, Caformer ranks first across all 15 short-term forecasting categories. In addition to strong predictive accuracy, Caformer provides interpretable insights into the learned dependencies.
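The backdoor adjustment mentioned in the abstract is a standard tool from causal inference: the interventional distribution is obtained by averaging the conditional outcome over the marginal of the confounder, P(y | do(x)) = Σ_z P(y | x, z) P(z). The following is a minimal toy sketch of this formula on discrete variables; the probability tables, variable names, and the `backdoor_adjust` helper are illustrative assumptions and do not reflect the paper's actual implementation, which applies the adjustment to learned environmental factors inside a Transformer.

```python
import numpy as np

# Toy setup (assumed for illustration): Z is a binary confounder
# standing in for the "environment", X a binary treatment, Y a
# binary outcome. All tables below are made-up numbers.

# P(z): marginal distribution of the confounder
p_z = np.array([0.7, 0.3])

# P(y=1 | x, z): rows indexed by x, columns by z
p_y1_given_xz = np.array([[0.1, 0.5],
                          [0.3, 0.9]])

def backdoor_adjust(x: int) -> float:
    """Backdoor adjustment: P(y=1 | do(X=x)) = sum_z P(y=1 | x, z) P(z).

    Averaging over P(z), rather than the confounded P(z | x),
    removes the spurious correlation that Z induces between X and Y.
    """
    return float(p_y1_given_xz[x] @ p_z)

# Interventional effect of switching X from 0 to 1
effect = backdoor_adjust(1) - backdoor_adjust(0)
print(backdoor_adjust(0))  # 0.1*0.7 + 0.5*0.3 = 0.22
print(backdoor_adjust(1))  # 0.3*0.7 + 0.9*0.3 = 0.48
print(effect)              # 0.26
```

The same averaging principle underlies the paper's deconfounding step, with the discrete sum replaced by an expectation over the extracted environmental representations.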