Learning Boltzmann Generators via Constrained Mass Transport
By: Christopher von Klitzing, Denis Blessing, Henrik Schopmans, and more
Potential Business Impact:
Helps computers better learn how molecules move.
Efficient sampling from high-dimensional and multimodal unnormalized probability distributions is a central challenge in many areas of science and machine learning. We focus on Boltzmann generators (BGs), which aim to sample the Boltzmann distribution of physical systems, such as molecules, at a given temperature. Classical variational approaches that minimize the reverse Kullback-Leibler divergence are prone to mode collapse, while annealing-based methods, commonly using geometric schedules, can suffer from mass teleportation and rely heavily on schedule tuning. We introduce Constrained Mass Transport (CMT), a variational framework that generates intermediate distributions under constraints on both the KL divergence and the entropy decay between successive steps. These constraints enhance distributional overlap, mitigate mass teleportation, and counteract premature convergence. Across standard BG benchmarks and the ELIL tetrapeptide introduced here, the largest system studied to date without access to samples from molecular dynamics, CMT consistently surpasses state-of-the-art variational methods, achieving more than 2.5x higher effective sample size while avoiding mode collapse.
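To make the "constraints between successive steps" idea concrete, here is a minimal toy sketch, not the paper's CMT algorithm: annealing between two zero-mean 1-D Gaussians along a geometric path, where each step size is chosen adaptively so that the KL divergence between successive intermediate distributions stays below a budget. All names, the example densities, and the budget `eps` are illustrative assumptions; in 1-D Gaussians the KL divergence is available in closed form, which keeps the sketch self-contained.

```python
import math

# Toy sketch (illustrative only, not the paper's CMT method):
# anneal from a broad base N(0, S0^2) to a narrow target N(0, S1^2)
# along the geometric path, choosing each step so that
# KL(pi_beta || pi_beta') <= eps between successive intermediates.

S0, S1 = 4.0, 0.5  # hypothetical base / target standard deviations

def var_at(beta):
    # Geometric mixture of two zero-mean Gaussians is again Gaussian,
    # with the precision (inverse variance) interpolated linearly.
    prec = (1.0 - beta) / S0**2 + beta / S1**2
    return 1.0 / prec

def kl_gauss(va, vb):
    # Closed-form KL(N(0, va) || N(0, vb)).
    return 0.5 * (va / vb - 1.0 - math.log(va / vb))

def next_beta(beta, eps=0.05, tol=1e-8):
    # Largest beta' in (beta, 1] with KL(pi_beta || pi_beta') <= eps,
    # found by bisection (KL grows monotonically along this path).
    va = var_at(beta)
    if kl_gauss(va, var_at(1.0)) <= eps:
        return 1.0
    lo, hi = beta, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if kl_gauss(va, var_at(mid)) <= eps:
            lo = mid
        else:
            hi = mid
    return lo

beta, schedule = 0.0, [0.0]
while beta < 1.0:
    beta = next_beta(beta)
    schedule.append(beta)
print(schedule)  # adaptive schedule: dense where distributions change fast
```

Note the contrast the abstract draws: a fixed geometric schedule prescribes the betas up front, whereas a KL-constrained schedule like this toy one spaces steps so successive distributions always overlap, which is the overlap property CMT enforces (together with an entropy-decay constraint) in the full high-dimensional setting.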
Similar Papers
Unlocking the Power of Boltzmann Machines by Parallelizable Sampler and Efficient Temperature Estimation
Machine Learning (CS)
Makes smart computers learn faster and better.
Generative modeling using evolved quantum Boltzmann machines
Quantum Physics
Teaches quantum computers to learn hard patterns.
Quantum-Boosted High-Fidelity Deep Learning
Machine Learning (CS)
Helps computers understand complex science data better.