Score: 2

Continuously Tempered Diffusion Samplers

Published: August 30, 2025 | arXiv ID: 2509.00316v1

By: Ezra Erives, Bowen Jing, Peter Holderrieth, and more

BigTech Affiliations: Massachusetts Institute of Technology

Potential Business Impact:

Helps machine learning systems explore complex distributions more thoroughly, leading to more accurate sampling results.

Business Areas:
A/B Testing, Data and Analytics

Annealing-based neural samplers seek to amortize sampling from unnormalized distributions by training neural networks to transport a family of densities interpolating from source to target. A crucial design choice in the training phase of such samplers is the proposal distribution used to generate the locations at which the loss is evaluated. Previous work has obtained such a proposal distribution by combining a partially learned transport with annealed Langevin dynamics. However, isolated modes and other pathological properties of the annealing path mean that such proposals explore insufficiently and thereby lower performance after training. To remedy this, we propose continuously tempered diffusion samplers, which leverage exploration techniques developed in the context of molecular dynamics to improve proposal distributions. Specifically, a family of distributions across different temperatures is introduced to lower energy barriers at higher temperatures and drive exploration at the lower temperature of interest. We empirically validate improved sampler performance driven by extended exploration. Code is available at https://github.com/eje24/ctds.
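To make the tempering idea concrete, below is a minimal, self-contained sketch of continuously tempered Langevin dynamics on a toy double-well energy. This is not the paper's algorithm (which trains a neural transport and uses tempered dynamics to generate proposal locations for the loss); the potential U, the step sizes, the clipping of the inverse temperature, and the collection threshold are all illustrative assumptions.

```python
# Illustrative sketch (not the authors' implementation): continuously tempered
# Langevin dynamics on an extended state (x, beta), where beta in [beta_min, 1]
# is an inverse-temperature coordinate. Low beta flattens energy barriers and
# drives exploration; samples collected near beta = 1 target the distribution
# of interest. The double-well energy U and all step sizes are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def U(x):
    """Hypothetical double-well potential with well-separated modes at x = +/-2."""
    return (x**2 - 4.0)**2 / 4.0

def grad_U(x):
    return (x**2 - 4.0) * x

def tempered_langevin(n_steps=50_000, dt=1e-3, dt_beta=1e-3,
                      beta_min=0.05, collect_above=0.95):
    x, beta = 2.0, 1.0
    samples = []
    for _ in range(n_steps):
        # Overdamped Langevin step on x under the tempered potential beta * U(x).
        x = x - dt * beta * grad_U(x) + np.sqrt(2.0 * dt) * rng.normal()
        # Langevin-style update on the inverse temperature: the drift -U(x) * dt_beta
        # comes from the extended log-density -beta * U(x). Practical continuous
        # tempering additionally learns an adaptive bias over beta to flatten its
        # marginal; that is omitted here for brevity.
        beta = beta - dt_beta * U(x) + np.sqrt(2.0 * dt_beta) * rng.normal()
        beta = np.clip(beta, beta_min, 1.0)
        # Keep only states near the temperature of interest (beta close to 1).
        if beta >= collect_above:
            samples.append(x)
    return np.array(samples)

if __name__ == "__main__":
    xs = tempered_langevin()
    print(f"{len(xs)} samples collected; fraction in left well: {np.mean(xs < 0):.2f}")
```

At low beta the effective potential beta * U(x) is nearly flat, so the chain can cross between wells; only states with beta near 1 are retained as draws from the target temperature. This mirrors the abstract's point that higher temperatures lower energy barriers and drive exploration at the lower temperature of interest.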

Country of Origin
🇺🇸 United States

Repos / Data Links
https://github.com/eje24/ctds

Page Count
22 pages

Category
Computer Science:
Machine Learning (CS)