DiffQ: Unified Parameter Initialization for Variational Quantum Algorithms via Diffusion Models
By: Chi Zhang, Mengxin Zheng, Qian Lou, and more
Potential Business Impact:
Gives quantum algorithms better starting parameters, so they train faster and reach better solutions.
Variational Quantum Algorithms (VQAs) are widely used in the noisy intermediate-scale quantum (NISQ) era, but their trainability and performance depend critically on initialization parameters that shape the optimization landscape. Existing machine learning-based initializers achieve state-of-the-art results yet remain constrained to single-task domains and small datasets of only hundreds of samples. We address these limitations by reformulating VQA parameter initialization as a generative modeling problem and introducing DiffQ, a parameter initializer based on the Denoising Diffusion Probabilistic Model (DDPM). To support robust training and evaluation, we construct a dataset of 15,085 instances spanning three domains and five representative tasks. Experiments demonstrate that DiffQ surpasses baselines, reducing initial loss by up to 8.95 and convergence steps by up to 23.4%.
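The core idea is to treat a good VQA initialization vector as a sample from a learned distribution and draw it via DDPM reverse diffusion. A minimal sketch of that sampling loop is below; the `denoiser` here is a hypothetical stand-in for a trained noise-prediction network (the paper's actual model and schedule are not specified here), and the step count and linear beta schedule are illustrative assumptions.

```python
import math
import random

# Illustrative DDPM setup (assumed values, not DiffQ's actual config)
T = 50                                                    # diffusion steps
betas = [1e-4 + (0.02 - 1e-4) * t / (T - 1) for t in range(T)]  # linear schedule
alphas = [1.0 - b for b in betas]
alpha_bars = []                                           # cumulative products
prod = 1.0
for a in alphas:
    prod *= a
    alpha_bars.append(prod)

def denoiser(x, t):
    """Hypothetical stand-in for the trained noise predictor eps_theta(x_t, t).
    A DiffQ-style initializer would use a neural network trained on the
    15,085-instance dataset; this toy version just scales its input."""
    return [0.1 * v for v in x]

def sample_init_params(n_params, rng):
    """DDPM ancestral sampling: start from Gaussian noise and iteratively
    denoise to produce one candidate VQA parameter vector."""
    x = [rng.gauss(0.0, 1.0) for _ in range(n_params)]    # x_T ~ N(0, I)
    for t in reversed(range(T)):
        eps = denoiser(x, t)
        coef = betas[t] / math.sqrt(1.0 - alpha_bars[t])
        sqrt_alpha = math.sqrt(alphas[t])
        sigma = math.sqrt(betas[t])                       # noise scale
        x = [
            (xi - coef * ei) / sqrt_alpha
            + (sigma * rng.gauss(0.0, 1.0) if t > 0 else 0.0)
            for xi, ei in zip(x, eps)
        ]
    return x

rng = random.Random(0)
theta0 = sample_init_params(8, rng)  # e.g. 8 circuit rotation angles
```

The sampled `theta0` would then be handed to a classical optimizer as the starting point for the variational circuit, replacing a random initialization.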
Similar Papers
Overcoming Dimensional Factorization Limits in Discrete Diffusion Models through Quantum Joint Distribution Learning
Quantum Physics
Makes computers create complex things in one step.
VQEzy: An Open-Source Dataset for Parameter Initialization in Variational Quantum Eigensolvers
Machine Learning (CS)
Helps quantum computers solve problems faster.
Mitigating Barren Plateaus in Quantum Denoising Diffusion Probabilistic Models
Machine Learning (CS)
Keeps quantum models from getting stuck while learning.