Score: 1

DiffQ: Unified Parameter Initialization for Variational Quantum Algorithms via Diffusion Models

Published: September 22, 2025 | arXiv ID: 2509.17324v1

By: Chi Zhang, Mengxin Zheng, Qian Lou, and more

Potential Business Impact:

Provides better starting parameters for variational quantum algorithms, so they converge faster and reach lower loss.

Business Areas:
Quantum Computing Science and Engineering

Variational Quantum Algorithms (VQAs) are widely used in the noisy intermediate-scale quantum (NISQ) era, but their trainability and performance depend critically on initialization parameters that shape the optimization landscape. Existing machine learning-based initializers achieve state-of-the-art results yet remain constrained to single-task domains and small datasets of only hundreds of samples. We address these limitations by reformulating VQA parameter initialization as a generative modeling problem and introducing DiffQ, a parameter initializer based on the Denoising Diffusion Probabilistic Model (DDPM). To support robust training and evaluation, we construct a dataset of 15,085 instances spanning three domains and five representative tasks. Experiments demonstrate that DiffQ surpasses baselines, reducing initial loss by up to 8.95 and convergence steps by up to 23.4%.
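To make the generative reformulation concrete, the sketch below shows the standard DDPM reverse-sampling loop used to draw a vector of circuit parameters from Gaussian noise. This is a minimal illustration of the general technique the abstract describes, not the authors' DiffQ implementation: the denoiser eps_model, the linear noise schedule, and the step count T are all assumptions, and a trained network conditioned on the task/circuit would be supplied in practice.

# Minimal sketch (not the DiffQ code): drawing VQA initial parameters from a
# trained DDPM denoiser. `eps_model`, the schedule, and T are illustrative.
import torch

T = 1000                                   # number of diffusion steps (assumed)
betas = torch.linspace(1e-4, 0.02, T)      # standard linear noise schedule
alphas = 1.0 - betas
alpha_bars = torch.cumprod(alphas, dim=0)

@torch.no_grad()
def sample_init_params(eps_model, n_params: int) -> torch.Tensor:
    """Run the DDPM reverse process to produce one set of circuit parameters."""
    x = torch.randn(n_params)              # start from pure Gaussian noise
    for t in reversed(range(T)):
        eps = eps_model(x, torch.tensor([t]))            # predicted noise
        coef = betas[t] / torch.sqrt(1.0 - alpha_bars[t])
        mean = (x - coef * eps) / torch.sqrt(alphas[t])
        noise = torch.randn_like(x) if t > 0 else torch.zeros_like(x)
        x = mean + torch.sqrt(betas[t]) * noise          # one denoising step
    return x                                # use as the VQA's initial parameters

The sampled vector would then be reshaped to match the ansatz's parameter layout and handed to the classical optimizer as its starting point, in place of random or task-specific heuristic initialization.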

Page Count
5 pages

Category
Computer Science:
Emerging Technologies