Dynamic Sparse Causal-Attention Temporal Networks for Interpretable Causality Discovery in Multivariate Time Series
By: Meriem Zerkouk, Miloud Mihoubi, Belkacem Chikhaoui
Potential Business Impact:
Finds hidden cause-and-effect links in data that changes over time.
Understanding causal relationships in multivariate time series (MTS) is essential for effective decision-making in fields such as finance and marketing, where complex dependencies and lagged effects challenge conventional analytical approaches. We introduce Dynamic Sparse Causal-Attention Temporal Networks for Interpretable Causality Discovery in MTS (DyCAST-Net), a novel architecture designed to enhance causal discovery by integrating dilated temporal convolutions and dynamic sparse attention mechanisms. DyCAST-Net effectively captures multiscale temporal dependencies through dilated convolutions while leveraging an adaptive thresholding strategy in its attention mechanism to eliminate spurious connections, ensuring both accuracy and interpretability. A statistical shuffle-test validation step further strengthens robustness by filtering false positives and improving the reliability of causal inference. Extensive evaluations on financial and marketing datasets demonstrate that DyCAST-Net consistently outperforms existing models such as TCDF, GCFormer, and CausalFormer. The model provides a more precise estimation of causal delays and significantly reduces false discoveries, particularly in noisy environments. Moreover, attention heatmaps offer interpretable insights, uncovering hidden causal patterns such as the mediated effects of advertising on consumer behavior and the influence of macroeconomic indicators on financial markets. Case studies illustrate DyCAST-Net's ability to detect latent mediators and lagged causal factors, making it particularly effective in high-dimensional, dynamic settings. The model's architecture, enhanced by RMSNorm stabilization and causal masking, ensures scalability and adaptability across diverse application domains.
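The abstract describes the architecture only in prose, so the following minimal sketch illustrates how its stated ingredients (dilated causal convolutions, adaptive-threshold sparse attention, causal masking, RMSNorm) could be combined. This is not the authors' code: class names such as `DyCASTBlock` and `SparseCausalAttention`, and the `threshold_scale` parameter, are illustrative assumptions.

```python
# Hypothetical sketch of a DyCAST-Net-style block; all names are assumptions,
# not the authors' released implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DilatedCausalConv(nn.Module):
    """1D convolution with left-only padding, so outputs never see future steps."""
    def __init__(self, channels, kernel_size=2, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x):                      # x: (batch, channels, time)
        x = F.pad(x, (self.pad, 0))            # pad only the past side
        return self.conv(x)


class RMSNorm(nn.Module):
    """Root-mean-square normalization over the feature dimension."""
    def __init__(self, dim, eps=1e-8):
        super().__init__()
        self.scale = nn.Parameter(torch.ones(dim))
        self.eps = eps

    def forward(self, x):                      # x: (..., dim)
        rms = x.pow(2).mean(dim=-1, keepdim=True).add(self.eps).sqrt()
        return x / rms * self.scale


class SparseCausalAttention(nn.Module):
    """Causally masked attention with an adaptive threshold that zeroes weak
    weights, suppressing spurious connections."""
    def __init__(self, dim, threshold_scale=0.5):
        super().__init__()
        self.q, self.k, self.v = nn.Linear(dim, dim), nn.Linear(dim, dim), nn.Linear(dim, dim)
        self.threshold_scale = threshold_scale

    def forward(self, x):                      # x: (batch, time, dim)
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.transpose(-2, -1) / (x.size(-1) ** 0.5)
        t = x.size(1)
        future = torch.triu(torch.ones(t, t, dtype=torch.bool, device=x.device), 1)
        scores = scores.masked_fill(future, float("-inf"))   # causal masking
        attn = scores.softmax(dim=-1)
        # Adaptive threshold: drop weights below a fraction of each row's mean,
        # then renormalize the surviving weights.
        thresh = self.threshold_scale * attn.mean(dim=-1, keepdim=True)
        attn = torch.where(attn >= thresh, attn, torch.zeros_like(attn))
        attn = attn / attn.sum(dim=-1, keepdim=True).clamp_min(1e-8)
        return attn @ v, attn                  # attn is what a heatmap would show


class DyCASTBlock(nn.Module):
    """One block: dilated causal conv -> RMSNorm -> sparse causal attention."""
    def __init__(self, dim, dilation=1):
        super().__init__()
        self.conv = DilatedCausalConv(dim, dilation=dilation)
        self.norm = RMSNorm(dim)
        self.attn = SparseCausalAttention(dim)

    def forward(self, x):                      # x: (batch, time, dim)
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)
        h = self.norm(x + h)
        out, attn = self.attn(h)
        return h + out, attn


if __name__ == "__main__":
    series = torch.randn(4, 64, 8)             # 4 samples, 64 steps, 8 variables
    out, attn = DyCASTBlock(dim=8, dilation=2)(series)
    print(out.shape, attn.shape)               # (4, 64, 8) and (4, 64, 64)
```

In this reading, the returned attention matrix is the object that would be rendered as an interpretability heatmap, and the shuffle-test validation mentioned in the abstract would be applied on top of such weights to filter false-positive causal links.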
Similar Papers
D-CTNet: A Dual-Branch Channel-Temporal Forecasting Network with Frequency-Domain Correction
Machine Learning (CS)
Predicts future data changes in factories accurately.
Beyond Attention: Learning Spatio-Temporal Dynamics with Emergent Interpretable Topologies
Machine Learning (CS)
Predicts future events better and faster.
Dynamic Attention (DynAttn): Interpretable High-Dimensional Spatio-Temporal Forecasting (with Application to Conflict Fatalities)
Applications
Predicts where and when fighting will happen.