ms-Mamba: Multi-scale Mamba for Time-Series Forecasting
By: Yusuf Meric Karadag, Sinan Kalkan, Ipek Gursel Dino
Potential Business Impact:
Predicts future events more accurately by looking at patterns at multiple time scales.
The problem of time-series forecasting is generally addressed with recurrent, Transformer-based, and, more recently, Mamba-based architectures. However, existing architectures typically process their input at a single temporal scale, which can be sub-optimal for tasks where information changes over multiple time scales. In this paper, we introduce a novel architecture called Multi-scale Mamba (ms-Mamba) to address this gap. ms-Mamba incorporates multiple temporal scales by using multiple Mamba blocks with different sampling rates ($\Delta$s). Our experiments on several benchmarks demonstrate that ms-Mamba outperforms state-of-the-art approaches, including recently proposed Transformer-based and Mamba-based models.
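To make the multi-scale idea concrete, here is a minimal sketch assuming a simplified diagonal state-space block in place of a full Mamba block: each parallel branch discretizes the same input with a different step size (Δ), and the branch outputs are fused with a learned linear projection. The names `SimpleSSMBlock`, `MsMambaLayer`, and the `delta_scales` argument are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class SimpleSSMBlock(nn.Module):
    """Diagonal state-space block; a simplified stand-in for a full Mamba block."""

    def __init__(self, d_model: int, d_state: int = 16, delta_scale: float = 1.0):
        super().__init__()
        self.delta_scale = delta_scale                              # branch-specific sampling-rate factor
        self.log_dt = nn.Parameter(torch.full((d_model,), -3.0))    # learned base step size (log-space)
        self.log_neg_A = nn.Parameter(torch.zeros(d_model, d_state))  # A = -exp(log_neg_A) < 0 (stable)
        self.B = nn.Parameter(torch.randn(d_model, d_state) * 0.1)
        self.C = nn.Parameter(torch.randn(d_model, d_state) * 0.1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, d_model)
        batch, length, d_model = x.shape
        dt = self.delta_scale * torch.exp(self.log_dt)              # (d_model,), scaled per branch
        A = -torch.exp(self.log_neg_A)                              # (d_model, d_state)
        A_bar = torch.exp(dt.unsqueeze(-1) * A)                     # zero-order-hold style discretization
        B_bar = dt.unsqueeze(-1) * self.B                           # Euler discretization of B

        h = x.new_zeros(batch, d_model, A.shape[-1])                # hidden state (batch, d_model, d_state)
        outputs = []
        for t in range(length):
            u = x[:, t, :].unsqueeze(-1)                            # (batch, d_model, 1)
            h = A_bar * h + B_bar * u                               # linear recurrence at this branch's scale
            y = (h * self.C).sum(-1)                                # readout: (batch, d_model)
            outputs.append(y)
        return torch.stack(outputs, dim=1)                          # (batch, length, d_model)


class MsMambaLayer(nn.Module):
    """Parallel SSM branches with different Delta scales, fused by a linear layer."""

    def __init__(self, d_model: int, delta_scales=(1.0, 2.0, 4.0)):
        super().__init__()
        self.branches = nn.ModuleList(
            SimpleSSMBlock(d_model, delta_scale=s) for s in delta_scales
        )
        self.fuse = nn.Linear(len(delta_scales) * d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        ys = [branch(x) for branch in self.branches]                # one temporal scale per branch
        return self.fuse(torch.cat(ys, dim=-1))                     # fuse fine and coarse scales


if __name__ == "__main__":
    layer = MsMambaLayer(d_model=8)
    series = torch.randn(2, 96, 8)                                  # (batch, look-back window, channels)
    print(layer(series).shape)                                      # torch.Size([2, 96, 8])
```

The key design choice mirrored here is that all branches see the full-resolution input; only the discretization step Δ differs, so coarse branches integrate information over longer effective horizons while fine branches track rapid changes.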
Similar Papers
M2Rec: Multi-scale Mamba for Efficient Sequential Recommendation
Information Retrieval
Finds what you'll like next, faster and smarter.
MS-Temba: Multi-Scale Temporal Mamba for Understanding Long Untrimmed Videos
CV and Pattern Recognition
Helps computers understand actions in long videos.
STM3: Mixture of Multiscale Mamba for Long-Term Spatio-Temporal Time-Series Prediction
Machine Learning (CS)
Predicts future events by seeing patterns in time.