Generative modeling using evolved quantum Boltzmann machines
By: Mark M. Wilde
Potential Business Impact:
Teaches quantum computers to learn hard patterns.
Born-rule generative modeling, a central task in quantum machine learning, seeks to learn probability distributions that can be efficiently sampled by measuring complex quantum states. One hope is for quantum models to efficiently capture probability distributions that are difficult to learn and simulate by classical means alone. Quantum Boltzmann machines were proposed about a decade ago for this purpose, yet efficient training methods have remained elusive. In this paper, I overcome this obstacle by proposing a practical solution that trains quantum Boltzmann machines for Born-rule generative modeling. Two key ingredients in the proposal are the Donsker-Varadhan variational representation of the classical relative entropy and the quantum Boltzmann gradient estimator of [Patel et al., arXiv:2410.12935]. I present the main result for a more general ansatz known as an evolved quantum Boltzmann machine [Minervini et al., arXiv:2501.03367], which combines parameterized real- and imaginary-time evolution. I also show how to extend the findings to other distinguishability measures beyond relative entropy. Finally, I present four different hybrid quantum-classical algorithms for the minimax optimization underlying training, and I discuss their theoretical convergence guarantees.
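For orientation, the two standard definitions the abstract relies on can be stated in generic notation (both are textbook facts; the paper's own notation may differ). The Born-rule model distribution of a thermal (Gibbs) state, and the Donsker-Varadhan variational representation of the classical relative entropy, are:

```latex
p_\theta(x) = \langle x|\rho(\theta)|x\rangle, \qquad
\rho(\theta) = \frac{e^{-H(\theta)}}{\operatorname{Tr}\!\left[e^{-H(\theta)}\right]},
```

```latex
D(p\,\|\,q) = \sup_{T}\Big\{\mathbb{E}_{x\sim p}[T(x)] - \ln \mathbb{E}_{x\sim q}\big[e^{T(x)}\big]\Big\}.
```

Because the Donsker-Varadhan bound is maximized over a witness function T while the model parameters are trained to minimize the divergence, training becomes a minimax problem. The following is a minimal, hedged sketch of the simplest such scheme (alternating gradient descent-ascent), written as a toy classical simulation under stated assumptions; it is not the paper's algorithm. In particular, the Gibbs state is exactly diagonalized (feasible only for a few qubits), the quantum Boltzmann gradient estimator of Patel et al. is replaced by a finite-difference gradient, and all names (generators, born_distribution, dv_objective, step sizes) are illustrative choices, not identifiers from the paper.

```python
# Toy classical simulation of Donsker-Varadhan minimax training for a
# Born-rule generative model. Assumptions: small dimension so the Gibbs
# state can be computed exactly; finite differences stand in for the
# quantum Boltzmann gradient estimator of Patel et al.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n_qubits = 2
d = 2 ** n_qubits

def random_hermitian(dim):
    """Random Hermitian generator G_j for the Hamiltonian H(theta) = sum_j theta_j G_j."""
    a = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    return (a + a.conj().T) / 2

generators = [random_hermitian(d) for _ in range(3)]

def born_distribution(theta):
    """Diagonal of the Gibbs state rho(theta) = exp(-H(theta)) / Z: the
    Born-rule distribution over computational basis states."""
    H = sum(t * G for t, G in zip(theta, generators))
    rho = expm(-H)
    rho /= np.trace(rho)
    return np.real(np.diag(rho))

def dv_objective(T, p_data, p_model):
    """Donsker-Varadhan lower bound on D(p_data || p_model):
    E_p[T] - ln E_q[e^T]."""
    return p_data @ T - np.log(p_model @ np.exp(T))

# Arbitrary target distribution to learn.
p_data = rng.random(d)
p_data /= p_data.sum()

theta = rng.normal(size=len(generators))
T = np.zeros(d)                      # DV witness: one value per basis state
eta_T, eta_theta, eps = 0.5, 0.05, 1e-4

for step in range(500):
    # Inner maximization: gradient ascent on the witness T.
    p_model = born_distribution(theta)
    w = p_model * np.exp(T)
    grad_T = p_data - w / w.sum()    # exact gradient of the DV objective in T
    T += eta_T * grad_T

    # Outer minimization: finite-difference gradient descent on theta.
    base = dv_objective(T, p_data, born_distribution(theta))
    grad_theta = np.zeros_like(theta)
    for j in range(len(theta)):
        shifted = theta.copy()
        shifted[j] += eps
        grad_theta[j] = (dv_objective(T, p_data, born_distribution(shifted)) - base) / eps
    theta -= eta_theta * grad_theta

print("final DV estimate of D(p_data || p_model):",
      dv_objective(T, p_data, born_distribution(theta)))
```

Alternating descent-ascent is only the simplest way to attack this minimax problem; the abstract mentions four different hybrid quantum-classical algorithms with convergence guarantees, whose specifics are in the paper rather than reproduced here.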
Similar Papers
Quantum-Boosted High-Fidelity Deep Learning
Machine Learning (CS)
Helps computers understand complex science data better.
Generative quantum advantage for classical and quantum problems
Quantum Physics
Quantum computers learn and create things impossible for regular computers.
Quantum latent distributions in deep generative models
Machine Learning (CS)
Quantum computers help AI make better pictures.