Batched Training for QLSTM vs. QFWP: A System-Oriented Approach to EPC-Aware RMSE-DA
By: Jun-Hao Chen, Ming-Kai Hung, Yun-Cheng Tsai, et al.
Potential Business Impact:
Enables faster and more accurate forecasting of currency price movements.
We compare two quantum sequence models, QLSTM and QFWP, under an equal-parameter-count (EPC) and adjoint-differentiation setup on daily EUR/USD forecasting, treated as a controlled one-dimensional time-series case study. Across 10 random seeds and batch sizes from 4 to 64, we measure component-wise runtimes (training forward pass, backward pass, full training step, and inference) as well as accuracy (RMSE and directional accuracy). The batched forward pass scales well, by roughly 2.2× to 2.4×, but the backward pass scales modestly (QLSTM about 1.01× to 1.05×, QFWP about 1.18× to 1.22×), which caps end-to-end training speedups near 2×. QFWP achieves lower RMSE and higher directional accuracy at all batch sizes, supported by a Wilcoxon test (p ≤ 0.004) and a large Cliff's delta, while QLSTM reaches the highest throughput at batch size 64, revealing a clear speed-accuracy Pareto frontier. We provide an EPC-aligned, numerically checked benchmarking pipeline and practical guidance on batch-size choices; broader datasets and hardware/noise settings are left for future work.
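The abstract's statistical comparison (a Wilcoxon signed-rank test with p ≤ 0.004 and a large Cliff's delta over 10 seeds) can be sketched in pure Python. The per-seed RMSE values below are hypothetical placeholders, not the paper's data; they only illustrate how the paired test and the effect size would be computed.

```python
from itertools import product

def cliffs_delta(a, b):
    """Cliff's delta: P(x > y) - P(x < y) over all cross pairs from a and b."""
    gt = sum(x > y for x in a for y in b)
    lt = sum(x < y for x in a for y in b)
    return (gt - lt) / (len(a) * len(b))

def wilcoxon_signed_rank_p(x, y):
    """Exact two-sided Wilcoxon signed-rank p-value for paired samples.
    Assumes distinct non-zero |differences|; enumerates all 2^n sign
    patterns, so it is only practical for small n (here n = 10 seeds)."""
    d = [xi - yi for xi, yi in zip(x, y)]
    rank = {v: i + 1 for i, v in enumerate(sorted(abs(v) for v in d))}
    w_plus = sum(rank[abs(v)] for v in d if v > 0)  # sum of positive ranks
    n = len(d)
    # Null distribution of W+: each rank is positive or negative with prob 1/2.
    dist = [sum(r for r, keep in zip(range(1, n + 1), signs) if keep)
            for signs in product((0, 1), repeat=n)]
    p_ge = sum(w >= w_plus for w in dist) / len(dist)
    p_le = sum(w <= w_plus for w in dist) / len(dist)
    return min(1.0, 2 * min(p_ge, p_le))

# Hypothetical per-seed RMSEs for 10 seeds (placeholder values only).
rmse_qlstm = [0.0121, 0.0123, 0.0120, 0.0125, 0.0127,
              0.0122, 0.0126, 0.0124, 0.0129, 0.0128]
rmse_qfwp  = [0.0111, 0.0112, 0.0108, 0.0112, 0.0113,
              0.0107, 0.0110, 0.0107, 0.0111, 0.0109]

p = wilcoxon_signed_rank_p(rmse_qlstm, rmse_qfwp)
delta = cliffs_delta(rmse_qlstm, rmse_qfwp)
```

With this sign convention, a positive delta means the first sample (QLSTM's RMSE) tends to be larger, i.e. QFWP is the more accurate model, matching the direction reported in the abstract.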
Similar Papers
Benchmarking Quantum and Classical Sequential Models for Urban Telecommunication Forecasting
Quantum Physics
Compares quantum and classical models for forecasting city telecom traffic.
QL-LSTM: A Parameter-Efficient LSTM for Stable Long-Sequence Modeling
Machine Learning (CS)
A parameter-efficient LSTM that stays stable on long sequences with less compute.
Quantum Temporal Convolutional Neural Networks for Cross-Sectional Equity Return Prediction: A Comparative Benchmark Study
Machine Learning (CS)
Benchmarks quantum temporal convolutional networks against classical baselines for predicting equity returns.