QKAN-LSTM: Quantum-inspired Kolmogorov-Arnold Long Short-Term Memory
By: Yu-Chao Hsu, Jiun-Cheng Jiang, Chun-Hua Lin, and more
Potential Business Impact:
Makes computer predictions much better with fewer parts.
Long short-term memory (LSTM) models are a type of recurrent neural network (RNN) central to sequential modeling tasks in domains such as urban telecommunication forecasting, where temporal correlations and nonlinear dependencies dominate. However, conventional LSTMs suffer from high parameter redundancy and limited nonlinear expressivity. In this work, we propose the Quantum-inspired Kolmogorov-Arnold Long Short-Term Memory (QKAN-LSTM), which integrates Data Re-Uploading Activation (DARUAN) modules into the gating structure of LSTMs. Each DARUAN acts as a quantum variational activation function (QVAF), enhancing frequency adaptability and enabling an exponentially enriched spectral representation without multi-qubit entanglement. The resulting architecture preserves quantum-level expressivity while remaining fully executable on classical hardware. Empirical evaluations on three datasets (Damped Simple Harmonic Motion, Bessel Function, and Urban Telecommunication) demonstrate that QKAN-LSTM achieves superior predictive accuracy and generalization with a 79% reduction in trainable parameters compared to classical LSTMs. We extend the framework to the Jiang-Huang-Chen-Goan Network (JHCG Net), which generalizes the Kolmogorov-Arnold Network (KAN) to encoder-decoder structures, and then realize the latent KAN with QKAN, creating a Hybrid QKAN (HQKAN) for hierarchical representation learning. The proposed HQKAN-LSTM thus offers a scalable and interpretable pathway toward quantum-inspired sequential modeling in real-world data environments.
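To make the mechanism concrete, below is a minimal PyTorch sketch (not the authors' code) of the idea: a single-qubit data re-uploading circuit, simulated classically as 2x2 rotations, serves as a learnable activation, and one such activation replaces each gate nonlinearity in an LSTM cell. The class names, the RY/RZ layer layout, the rescaling of gate outputs to [0, 1], and the sharing of one activation per gate are all assumptions made for illustration.

import torch
import torch.nn as nn

class DARUAN(nn.Module):
    # Hypothetical sketch of a data re-uploading activation: a
    # single-qubit variational circuit simulated classically.
    # Each layer re-uploads the input x as an RY(w_j * x + b_j)
    # rotation followed by a trainable RZ(phi_j) rotation; the
    # activation value is the Pauli-Z expectation of the final state.
    # Interleaving RZ keeps the rotations from collapsing into one,
    # so the output is a trigonometric series in x whose frequency
    # content grows with the number of re-upload layers.
    def __init__(self, n_layers: int = 3):
        super().__init__()
        self.w = nn.Parameter(torch.randn(n_layers))    # input scalings (frequencies)
        self.b = nn.Parameter(torch.zeros(n_layers))    # phase offsets
        self.phi = nn.Parameter(torch.randn(n_layers))  # trainable RZ angles

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        shape = x.shape
        x = x.reshape(-1)
        # Start every element in the |0> state, stored as two amplitudes.
        a = torch.ones_like(x, dtype=torch.cfloat)
        b = torch.zeros_like(x, dtype=torch.cfloat)
        for j in range(self.w.shape[0]):
            theta = self.w[j] * x + self.b[j]           # re-uploaded data angle
            c, s = torch.cos(0.5 * theta), torch.sin(0.5 * theta)
            a, b = c * a - s * b, s * a + c * b         # apply RY(theta)
            pa = torch.exp(-0.5j * self.phi[j])
            a, b = pa * a, pa.conj() * b                # apply RZ(phi_j)
        z = a.abs() ** 2 - b.abs() ** 2                 # <Z> expectation in [-1, 1]
        return z.reshape(shape)

class QKANLSTMCell(nn.Module):
    # Minimal sketch of DARUAN modules inside the LSTM gating
    # structure; the affine maps remain classical.
    def __init__(self, input_size: int, hidden_size: int, n_layers: int = 3):
        super().__init__()
        self.linear = nn.Linear(input_size + hidden_size, 4 * hidden_size)
        self.act = nn.ModuleList(DARUAN(n_layers) for _ in range(4))

    def forward(self, x, state):
        h, c = state
        zi, zf, zg, zo = self.linear(torch.cat([x, h], dim=-1)).chunk(4, dim=-1)
        i = 0.5 * (1 + self.act[0](zi))   # input gate, rescaled to [0, 1]
        f = 0.5 * (1 + self.act[1](zf))   # forget gate, rescaled to [0, 1]
        g = self.act[2](zg)               # cell candidate, already in [-1, 1]
        o = 0.5 * (1 + self.act[3](zo))   # output gate, rescaled to [0, 1]
        c = f * c + i * g
        h = o * torch.tanh(c)
        return h, (h, c)

# Usage: one step on a toy batch of 4 scalar inputs.
cell = QKANLSTMCell(input_size=1, hidden_size=8)
h, c = torch.zeros(4, 8), torch.zeros(4, 8)
y, (h, c) = cell(torch.randn(4, 1), (h, c))

Since each DARUAN carries only a few scalars per gate, the gate nonlinearities add almost no parameters; this is consistent with, though not a derivation of, the parameter reduction reported in the abstract.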
Similar Papers
KAN vs LSTM Performance in Time Series Forecasting
Machine Learning (CS)
LSTM predicts stock prices much better than KAN.
QuantKAN: A Unified Quantization Framework for Kolmogorov Arnold Networks
Machine Learning (CS)
Makes smart computer brains smaller and faster.
Physics-informed time series analysis with Kolmogorov-Arnold Networks under Ehrenfest constraints
Machine Learning (CS)
Predicts how tiny things move much faster.