Probabilistic Computers for Neural Quantum States
By: Shuvro Chowdhury, Jasper Pieterse, Navid Anjum Aadit, et al.
Neural quantum states efficiently represent many-body wavefunctions with neural networks, but the cost of Monte Carlo sampling limits their scaling to large system sizes. Here we address this challenge by combining sparse Boltzmann machine architectures with probabilistic computing hardware. We implement a probabilistic computer on field-programmable gate arrays (FPGAs) and use it as a fast sampler for energy-based neural quantum states. For the two-dimensional transverse-field Ising model at criticality, we obtain accurate ground-state energies for lattices up to 80 $\times$ 80 (6400 spins) using a custom multi-FPGA cluster. Furthermore, we introduce a dual-sampling algorithm to train deep Boltzmann machines, replacing intractable marginalization with conditional sampling over auxiliary layers. This enables the training of sparse deep models and improves parameter efficiency relative to shallow networks. Using this algorithm, we train deep Boltzmann machines for a 35 $\times$ 35 lattice (1225 spins). Together, these results demonstrate that probabilistic hardware can overcome the sampling bottleneck in variational simulation of quantum many-body systems, opening a path to larger system sizes and deeper variational architectures.
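The sampler's role can be illustrated with a minimal software sketch of p-bit Gibbs dynamics for a sparse Boltzmann machine, the kind of update a probabilistic computer performs in hardware. This is a hedged illustration under standard assumptions (conditional flip probability $P(s_i{=}+1) = \sigma(2\beta I_i)$ with local field $I_i = \sum_j J_{ij} s_j + h_i$); the function name and toy couplings are ours, not the paper's implementation.

```python
import numpy as np

def pbit_sweep(s, J, h, rng, beta=1.0):
    """One asynchronous Gibbs sweep over all p-bits (in place), s_i in {-1, +1}."""
    for i in rng.permutation(len(s)):
        I = J[i] @ s + h[i]  # local field acting on p-bit i
        # p-bit update rule: s_i = sign(tanh(beta * I_i) - uniform(-1, 1)),
        # equivalent to flipping to +1 with probability sigmoid(2 * beta * I_i)
        s[i] = 1 if np.tanh(beta * I) > rng.uniform(-1.0, 1.0) else -1
    return s

# Toy example: sample a small sparse Boltzmann machine with
# nearest-neighbour ring couplings (illustrative values only).
rng = np.random.default_rng(0)
n = 16
J = np.zeros((n, n))
for i in range(n):
    J[i, (i + 1) % n] = J[(i + 1) % n, i] = 0.5
h = np.zeros(n)
s = rng.choice([-1, 1], size=n)
for _ in range(100):  # burn-in sweeps before collecting samples
    pbit_sweep(s, J, h, rng)
```

In hardware, every p-bit performs this stochastic update in parallel, which is what removes the sampling bottleneck relative to sequential software Monte Carlo; a sparse coupling matrix J is what makes simultaneous updates of non-neighbouring p-bits valid.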