TimeFormer: Transformer with Attention Modulation Empowered by Temporal Characteristics for Time Series Forecasting
By: Zhipeng Liu, Peibo Duan, Xuan Tang, and more
Potential Business Impact:
Predicts future events better by learning from the past.
Although Transformers excel in natural language processing, extending them to time series forecasting remains challenging because the differences between textual and temporal modalities are insufficiently considered. In this paper, we develop a novel Transformer architecture designed for time series data, aiming to maximize its representational capacity. We identify two key but often overlooked characteristics of time series: (1) unidirectional influence from the past to the future, and (2) the phenomenon of decaying influence over time. These characteristics are introduced to enhance the attention mechanism of Transformers. We propose TimeFormer, whose core innovation is a self-attention mechanism with two modulation terms (MoSA), designed to capture these temporal priors of time series under the constraints of the Hawkes process and causal masking. Additionally, TimeFormer introduces a framework based on multi-scale and subsequence analysis to capture semantic dependencies at different temporal scales, enriching the temporal dependencies. Extensive experiments conducted on multiple real-world datasets show that TimeFormer significantly outperforms state-of-the-art methods, achieving up to a 7.45% reduction in MSE compared to the best baseline and setting new benchmarks on 94.04% of evaluation metrics. Moreover, we demonstrate that the MoSA mechanism can be broadly applied to enhance the performance of other Transformer-based models.
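The two temporal priors the abstract names can be pictured as two additive modulations on ordinary scaled dot-product attention: a causal mask enforcing past-to-future influence, and a Hawkes-style exponential decay penalizing attention to distant past positions. The sketch below is an illustrative reconstruction, not the paper's actual MoSA formulation; the function name `mosa_sketch`, the `decay_rate` parameter, and the exact decay form are assumptions.

```python
import numpy as np

def mosa_sketch(q, k, v, decay_rate=0.1):
    """Hypothetical single-head attention with two temporal modulations:
    (1) a causal mask (only the past influences the future) and
    (2) an exponential distance penalty inspired by the Hawkes process.
    The real TimeFormer/MoSA terms may differ."""
    T, d = q.shape
    scores = q @ k.T / np.sqrt(d)          # standard scaled dot-product scores

    # Modulation 1: causal mask — position t may only attend to positions s <= t.
    causal = np.tril(np.ones((T, T), dtype=bool))
    scores = np.where(causal, scores, -np.inf)

    # Modulation 2: decaying influence — attention to older positions is
    # down-weighted linearly in log-space, i.e. exponentially after softmax.
    idx = np.arange(T)
    distance = np.maximum(idx[:, None] - idx[None, :], 0)  # temporal gap t - s
    scores = scores - decay_rate * distance

    # Numerically stable softmax over the masked, modulated scores.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Tiny usage example: random projections of a length-6 series, head dim 4.
rng = np.random.default_rng(0)
q, k, v = rng.normal(size=(3, 6, 4))
out = mosa_sketch(q, k, v)
print(out.shape)  # (6, 4)
```

Because both modulations are additive in the score matrix before the softmax, the same idea can be dropped into other Transformer variants, which is consistent with the abstract's claim that MoSA transfers to other Transformer-based models.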
Similar Papers
AutoHFormer: Efficient Hierarchical Autoregressive Transformer for Time Series Prediction
Machine Learning (CS)
Predicts future events faster and more accurately.
Minimal Time Series Transformer
Machine Learning (CS)
Predicts future numbers using past patterns.
VARMA-Enhanced Transformer for Time Series Forecasting
Machine Learning (CS)
Predicts future events more accurately by combining old and new methods.