Filter then Attend: Improving attention-based Time Series Forecasting with Spectral Filtering
By: Elisha Dayag, Nhat Thanh Van Tran, Jack Xin
Potential Business Impact:
Makes computer forecasts of future events more accurate while using smaller models.
Transformer-based models are at the forefront of long time-series forecasting (LTSF). While these models often achieve state-of-the-art results, they suffer from a bias toward low frequencies in the data and from high computational and memory requirements. Recent work has established that learnable frequency filters can be an integral part of a deep forecasting model by enhancing the model's spectral utilization. However, these works use a multilayer perceptron to process the filtered signals and therefore do not address the issues found in transformer-based models. In this paper, we establish that adding a filter to the beginning of transformer-based models enhances their performance in long time-series forecasting. We add learnable filters, which introduce only $\approx 1000$ additional parameters, to several transformer-based models and observe, in multiple instances, a 5-10\% relative improvement in forecasting performance. We also find that, with filters added, we can decrease the embedding dimension of our models, resulting in transformer-based architectures that are both smaller and more effective than their non-filtering base models. Finally, we conduct synthetic experiments to analyze how the filters enable transformer-based models to better utilize the full spectrum for forecasting.
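The abstract does not specify the exact filter parameterization, so the following is a minimal sketch of the "filter then attend" idea, assuming a learnable per-frequency complex gain applied via the real FFT before a generic transformer backbone. The class name `SpectralFilter` and the stand-in encoder backbone are illustrative, not the authors' implementation; with a lookback window of 512 steps, the filter below has 2 × 257 = 514 parameters, the same order as the $\approx 1000$ quoted in the abstract.

```python
import torch
import torch.nn as nn


class SpectralFilter(nn.Module):
    """Learnable frequency-domain filter applied before the backbone.

    Hypothetical parameterization: one complex gain per rFFT bin,
    shared across input channels, initialized to the identity filter.
    """

    def __init__(self, seq_len: int):
        super().__init__()
        n_freq = seq_len // 2 + 1  # rFFT bins for a real-valued signal
        self.gain_re = nn.Parameter(torch.ones(n_freq))   # Re(gain), init 1
        self.gain_im = nn.Parameter(torch.zeros(n_freq))  # Im(gain), init 0

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, channels)
        spec = torch.fft.rfft(x, dim=1)                   # complex spectrum
        gain = torch.complex(self.gain_re, self.gain_im)  # (n_freq,)
        spec = spec * gain.unsqueeze(-1)                  # broadcast over channels
        return torch.fft.irfft(spec, n=x.size(1), dim=1)  # back to time domain


# "Filter then attend": prepend the filter to any transformer forecaster.
seq_len, n_channels = 512, 7
filt = SpectralFilter(seq_len)        # 2 * 257 = 514 parameters
backbone = nn.TransformerEncoder(     # stand-in for any transformer backbone
    nn.TransformerEncoderLayer(d_model=n_channels, nhead=1, batch_first=True),
    num_layers=2,
)
model = nn.Sequential(filt, backbone)

x = torch.randn(32, seq_len, n_channels)  # (batch, lookback, variables)
y = model(x)                              # filtered, then attended
print(y.shape)                            # torch.Size([32, 512, 7])
```

Because the filter is initialized to the identity (gain $1 + 0j$ at every bin), training starts from the behavior of the unfiltered base model and only deviates where reweighting the spectrum reduces forecasting loss.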
Similar Papers
Frequency-Constrained Learning for Long-Term Forecasting
Machine Learning (CS)
Predicts future events by understanding repeating patterns.
FilterTS: Comprehensive Frequency Filtering for Multivariate Time Series Forecasting
Machine Learning (CS)
Predicts future events better by finding hidden patterns.
Selective Learning for Deep Time Series Forecasting
Machine Learning (CS)
Trains computers to predict future events more accurately.