DB2-TransF: All You Need Is Learnable Daubechies Wavelets for Time Series Forecasting
By: Moulik Gupta, Achyut Mani Tripathi
Potential Business Impact:
Predicts future events faster and with less computing power.
Time series forecasting requires models that can efficiently capture complex temporal dependencies, especially in large-scale and high-dimensional settings. While Transformer-based architectures excel at modeling long-range dependencies, their quadratic computational complexity limits scalability and adaptability. To overcome these challenges, we introduce DB2-TransF, a novel Transformer-inspired architecture that replaces the self-attention mechanism with a learnable Daubechies wavelet coefficient layer. This wavelet-based module efficiently captures multi-scale local and global patterns and enhances the modeling of correlations across multiple time series. Extensive experiments on 13 standard forecasting benchmarks demonstrate that DB2-TransF matches or surpasses the predictive accuracy of conventional Transformers while substantially reducing memory usage. These results position DB2-TransF as a scalable and resource-efficient framework for advanced time series forecasting. Our code is available at https://github.com/SteadySurfdom/DB2-TransF
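The core idea, swapping attention for a trainable wavelet filter bank, can be illustrated with a short sketch. The PyTorch module below is a minimal, hypothetical illustration (the authors' actual layer lives in the linked repository): it initializes the four classical db2 filter taps as trainable parameters and applies them as strided depthwise convolutions, producing approximation and detail coefficients at half the input resolution.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class LearnableDB2Layer(nn.Module):
    """One level of a Daubechies-2 (db2) decomposition whose four filter
    taps are trainable, initialized to the classical db2 values.
    Illustrative sketch only -- not the authors' exact module."""

    def __init__(self):
        super().__init__()
        s3, s2 = math.sqrt(3.0), math.sqrt(2.0)
        # Classical db2 low-pass (scaling) coefficients.
        h = torch.tensor([1 + s3, 3 + s3, 3 - s3, 1 - s3]) / (4 * s2)
        self.h = nn.Parameter(h)  # learnable low-pass filter

    def forward(self, x):
        # x: (batch, channels, length), length assumed even
        _, C, _ = x.shape
        # High-pass filter from the alternating-flip (quadrature-mirror)
        # relation g[k] = (-1)^k h[3-k] used for orthogonal wavelets.
        signs = torch.tensor([1.0, -1.0, 1.0, -1.0], device=self.h.device)
        g = self.h.flip(0) * signs
        lo_w = self.h.view(1, 1, -1).repeat(C, 1, 1)  # depthwise weights
        hi_w = g.view(1, 1, -1).repeat(C, 1, 1)
        approx = F.conv1d(x, lo_w, stride=2, padding=1, groups=C)
        detail = F.conv1d(x, hi_w, stride=2, padding=1, groups=C)
        return approx, detail  # multi-scale features, each length/2


# Example: decompose a batch of 7-variate series of length 96.
x = torch.randn(32, 7, 96)
approx, detail = LearnableDB2Layer()(x)
print(approx.shape, detail.shape)  # torch.Size([32, 7, 48]) each
```

Since each decomposition level is just a pair of strided depthwise convolutions, its cost grows linearly with sequence length, which is consistent with the memory savings over quadratic self-attention claimed in the abstract.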
Similar Papers
WaveTuner: Comprehensive Wavelet Subband Tuning for Time Series Forecasting
Machine Learning (CS)
Improves predictions by analyzing all time-series details.
Wavelet Mixture of Experts for Time Series Forecasting
Machine Learning (CS)
Predicts future events more accurately with less data.
WaveletDiff: Multilevel Wavelet Diffusion For Time Series Generation
Machine Learning (CS)
Creates realistic synthetic time-series data for training.