CPformer -- Concept- and Physics-Enhanced Transformer for Time Series Forecasting
By: Hongwei Ma, Junbin Gao, Minh-Ngoc Tran
Potential Business Impact:
Predicts electricity demand, traffic, and illness trends more accurately by combining machine learning with physical laws.
Accurate, explainable, and physically credible forecasting remains a persistent challenge for multivariate time series whose statistical properties vary across domains. We present CPformer, a Concept- and Physics-enhanced Transformer that channels every prediction through five self-supervised, domain-agnostic concepts while enforcing differentiable residuals drawn from first-principle constraints. Unlike prior efficiency-oriented Transformers that rely purely on sparsity or frequency priors, CPformer combines latent transparency with hard scientific guidance while retaining attention for long contexts. We evaluate CPformer on six publicly available datasets: sub-hourly Electricity and Traffic, hourly ETT, high-dimensional Weather, weekly Influenza-like Illness, and minute-level Exchange Rate; CPformer achieves the lowest error in eight of twelve MSE/MAE cells. Relative to the strongest Transformer baseline (FEDformer), CPformer reduces mean squared error by 23% on Electricity, 44% on Traffic, and 61% on Illness, while matching performance on the strictly periodic Weather and ETT series.
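The abstract names two mechanisms: a concept bottleneck that routes every forecast through a small set of latent concepts, and a differentiable physics-style residual added to the training loss. The sketch below illustrates those two ideas only; it is not the authors' implementation. The five-concept width is taken from the abstract, while the smoothness penalty standing in for the "first-principle constraints", all layer sizes, and the loss weighting are illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' code): a Transformer forecaster whose
# predictions pass through a 5-dimensional concept bottleneck, trained with an
# extra differentiable "physics" residual term in the loss.
import torch
import torch.nn as nn


class ConceptPhysicsForecaster(nn.Module):
    def __init__(self, n_series, d_model=64, n_concepts=5, horizon=24):
        super().__init__()
        self.embed = nn.Linear(n_series, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        # Concept bottleneck: every prediction is forced through n_concepts scalars.
        self.to_concepts = nn.Linear(d_model, n_concepts)
        self.from_concepts = nn.Linear(n_concepts, horizon * n_series)
        self.horizon, self.n_series = horizon, n_series

    def forward(self, x):                            # x: (batch, lookback, n_series)
        h = self.encoder(self.embed(x))              # (batch, lookback, d_model)
        concepts = self.to_concepts(h.mean(dim=1))   # (batch, n_concepts)
        y = self.from_concepts(concepts)
        return y.view(-1, self.horizon, self.n_series), concepts


def physics_residual(y_pred):
    # Placeholder first-principles penalty: discourage large second differences
    # (a smoothness/inertia constraint). The paper's actual constraints are
    # domain-specific and not spelled out in the abstract.
    second_diff = y_pred[:, 2:] - 2 * y_pred[:, 1:-1] + y_pred[:, :-2]
    return second_diff.pow(2).mean()


# Usage: combine the data-fit term and the physics residual in one differentiable loss.
model = ConceptPhysicsForecaster(n_series=7)
x = torch.randn(8, 96, 7)                            # batch of 96-step lookback windows
y_true = torch.randn(8, 24, 7)
y_pred, concepts = model(x)
loss = nn.functional.mse_loss(y_pred, y_true) + 0.1 * physics_residual(y_pred)
loss.backward()
```

Because the bottleneck exposes a handful of named concept activations per forecast, the same mechanism that constrains the model also provides the explainability the abstract claims.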
Similar Papers
Towards Resilient Transportation: A Conditional Transformer for Accident-Informed Traffic Forecasting
Machine Learning (CS)
Predicts traffic jams better by using accident data.
Fast-Powerformer: A Memory-Efficient Transformer for Accurate Mid-Term Wind Power Forecasting
Machine Learning (CS)
Predicts wind power accurately and fast.
DeepKoopFormer: A Koopman Enhanced Transformer Based Architecture for Time Series Forecasting
Machine Learning (CS)
Predicts future events more accurately and reliably.