Score: 1

Transforming Causality: Transformer-Based Temporal Causal Discovery with Prior Knowledge Integration

Published: August 21, 2025 | arXiv ID: 2508.15928v1

By: Jihua Huang, Yi Yao, Ajay Divakaran

Potential Business Impact:

Identifies true causal drivers, and their time lags, in noisy time-series data.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

We introduce a novel framework for temporal causal discovery and inference that addresses two key challenges: complex nonlinear dependencies and spurious correlations. Our approach employs a multi-layer Transformer-based time-series forecaster to capture long-range, nonlinear temporal relationships among variables. After training, we extract the underlying causal structure and associated time lags from the forecaster using gradient-based analysis, enabling the construction of a causal graph. To mitigate the impact of spurious causal relationships, we introduce a prior knowledge integration mechanism based on attention masking, which consistently enforces the exclusion of user-specified causal links across multiple Transformer layers. Extensive experiments show that our method significantly outperforms other state-of-the-art approaches, achieving a 12.8% improvement in F1-score for causal discovery and 98.9% accuracy in estimating causal lags.
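
To make the two mechanisms named in the abstract more concrete, below is a minimal PyTorch sketch of (1) an additive attention mask that removes user-excluded causal links and (2) gradient-based extraction of causal lags from a trained forecaster. The paper does not publish implementation details, so every name here (`build_exclusion_mask`, `extract_causal_lags`, `excluded_links`, the toy forecaster) is a hypothetical stand-in, not the authors' code.

```python
# Hedged sketch (not the authors' implementation) of the two ideas in the abstract:
# (1) an attention mask encoding user-excluded causal links, and
# (2) gradient-based extraction of causal lags from a trained forecaster.
import torch
import torch.nn as nn

n_vars, max_lag = 4, 5          # number of series and longest lag considered (assumed)
excluded_links = {(2, 0)}       # user prior: variable 2 does NOT cause variable 0

# --- (1) Prior-knowledge attention mask --------------------------------------
# Assuming tokens are (variable, time-step) pairs, forbid attention from tokens of
# an effect variable to tokens of an excluded cause variable. The same additive
# mask would be reused in every Transformer layer.
def build_exclusion_mask(n_vars: int, seq_len: int) -> torch.Tensor:
    n_tokens = n_vars * seq_len
    mask = torch.zeros(n_tokens, n_tokens)
    for cause, effect in excluded_links:
        for t_q in range(seq_len):          # query tokens of the effect variable
            for t_k in range(seq_len):      # key tokens of the excluded cause
                mask[effect * seq_len + t_q, cause * seq_len + t_k] = float("-inf")
    return mask                             # added to the attention logits

# --- (2) Gradient-based causal graph / lag extraction ------------------------
# After training, read off how sensitive the forecast of variable j is to the
# past values of variable i, and take the most influential lag.
def extract_causal_lags(forecaster: nn.Module, x: torch.Tensor) -> torch.Tensor:
    """x: (batch, seq_len, n_vars) history; returns an (n_vars, n_vars) lag matrix."""
    x = x.clone().requires_grad_(True)
    y_hat = forecaster(x)                   # (batch, n_vars) one-step-ahead forecast
    lags = torch.zeros(n_vars, n_vars, dtype=torch.long)
    for j in range(n_vars):
        grad = torch.autograd.grad(y_hat[:, j].sum(), x, retain_graph=True)[0]
        saliency = grad.abs().mean(dim=0)   # (seq_len, n_vars) sensitivity map
        # lag = distance from the most influential time step to the forecast step
        lags[:, j] = (x.shape[1] - 1) - saliency.argmax(dim=0)
    return lags

# Usage with a stand-in forecaster (the paper's model would be a masked Transformer):
if __name__ == "__main__":
    seq_len = max_lag + 1
    toy_forecaster = nn.Sequential(nn.Flatten(), nn.Linear(seq_len * n_vars, n_vars))
    mask = build_exclusion_mask(n_vars, seq_len)   # pass as attn_mask in each layer
    x = torch.randn(8, seq_len, n_vars)
    print(extract_causal_lags(toy_forecaster, x))
```

The key design choice reflected here is that the exclusion prior is applied inside attention rather than as a post-hoc filter on the extracted graph, so excluded links cannot influence the forecaster's representations in any layer.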

Page Count
10 pages

Category
Computer Science:
Machine Learning (CS)