Quantum-Optimized Selective State Space Model for Efficient Time Series Prediction
By: Stefan-Alexandru Jura, Mihai Udrescu, Alexandru Topirceanu
Potential Business Impact:
Predicts future events more accurately over long time horizons.
Long-range time series forecasting remains challenging, as it requires capturing non-stationary and multi-scale temporal dependencies while maintaining noise robustness, efficiency, and stability. Transformer-based architectures such as Autoformer and Informer improve generalization but suffer from quadratic complexity and degraded performance on very long time horizons. State space models, notably S-Mamba, provide linear-time updates but often face unstable training dynamics, sensitivity to initialization, and limited robustness for multivariate forecasting. To address these challenges, we propose the Quantum-Optimized Selective State Space Model (Q-SSM), a hybrid approach that integrates state space dynamics with a variational quantum gate. Instead of relying on expensive attention mechanisms, Q-SSM employs a simple parametrized quantum circuit (RY-RX ansatz) whose expectation values regulate memory updates adaptively. This quantum gating mechanism improves convergence stability, enhances the modeling of long-term dependencies, and provides a lightweight alternative to attention. We empirically validate Q-SSM on three widely used benchmarks, namely ETT, Traffic, and Exchange Rate. Results show that Q-SSM consistently improves over strong baselines (LSTM, TCN, Reformer), Transformer-based models, and S-Mamba. These findings demonstrate that variational quantum gating can address current limitations in long-range forecasting, leading to accurate and robust multivariate predictions.
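To make the gating idea concrete, below is a minimal sketch, assuming a per-qubit RY-RX ansatz without entanglement, a tanh-based encoding of the input into rotation angles, and a diagonal state transition; the function and parameter names (qssm_step, W_theta, W_phi, A_diag) are illustrative and not taken from the paper, whose exact circuit and state-space parametrization may differ. The Z expectation of each qubit is rescaled to (0, 1) and used as an adaptive gate that blends the previous memory with the new input.

```python
# Minimal sketch of quantum-gated state-space updates (assumptions noted above).
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def rx(phi):
    """Single-qubit RX rotation matrix."""
    c, s = np.cos(phi / 2), np.sin(phi / 2)
    return np.array([[c, -1j * s], [-1j * s, c]], dtype=complex)

def quantum_gate(thetas, phis):
    """Pauli-Z expectation values of an RY-RX ansatz, rescaled to (0, 1) gates."""
    gates = []
    for theta, phi in zip(thetas, phis):
        # Each qubit starts in |0>, receives RY(theta) then RX(phi).
        state = rx(phi) @ ry(theta) @ np.array([1.0, 0.0], dtype=complex)
        z_exp = np.abs(state[0]) ** 2 - np.abs(state[1]) ** 2  # <Z> in [-1, 1]
        gates.append(0.5 * (1.0 + z_exp))                      # map to (0, 1)
    return np.array(gates)

def qssm_step(h, x, A_diag, B, W_theta, W_phi):
    """One gated state-space update: the quantum gate modulates the memory update."""
    # Hypothetical encoding: project the input into bounded rotation angles.
    thetas = np.tanh(W_theta @ x) * np.pi
    phis = np.tanh(W_phi @ x) * np.pi
    g = quantum_gate(thetas, phis)                 # adaptive gate per state dimension
    return g * (A_diag * h) + (1.0 - g) * (B @ x)  # blend old memory and new input

# Toy usage: 4-dimensional hidden state, 3-dimensional input sequence.
rng = np.random.default_rng(0)
h = np.zeros(4)
A_diag = rng.uniform(0.8, 0.99, size=4)        # stable decay rates
B = rng.normal(scale=0.1, size=(4, 3))
W_theta = rng.normal(scale=0.1, size=(4, 3))
W_phi = rng.normal(scale=0.1, size=(4, 3))
for x in rng.normal(size=(10, 3)):
    h = qssm_step(h, x, A_diag, B, W_theta, W_phi)
print(h)
```

In this toy form the gate costs only a handful of closed-form expectation values per step, which is the sense in which such gating is a lightweight alternative to attention; a hardware or simulator-backed variational circuit with entangling layers would replace quantum_gate while leaving the recurrence unchanged.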
Similar Papers
Leveraging State Space Models in Long Range Genomics
Genomics
Helps computers understand long DNA codes better.
Rethinking Selectivity in State Space Models: A Minimal Predictive Sufficiency Approach
Machine Learning (CS)
Makes AI better at predicting by focusing on important past info.
PerfMamba: Performance Analysis and Pruning of Selective State Space Models
Machine Learning (CS)
Makes computer models run faster and use less memory.