Score: 1

MDNS: Masked Diffusion Neural Sampler via Stochastic Optimal Control

Published: August 14, 2025 | arXiv ID: 2508.10684v1

By: Yuchen Zhu, Wei Guo, Jaemoo Choi, and more

BigTech Affiliations: Meta

Potential Business Impact:

Trains neural networks to draw samples efficiently from enormous discrete spaces of possibilities, with applications in statistical physics, machine learning, and combinatorial optimization.

We study the problem of learning a neural sampler to generate samples from discrete state spaces where the target probability mass function $\pi\propto\mathrm{e}^{-U}$ is known up to a normalizing constant, an important task in fields such as statistical physics, machine learning, and combinatorial optimization. To better address this challenging task when the state space has a large cardinality and the distribution is multi-modal, we propose the $\textbf{M}$asked $\textbf{D}$iffusion $\textbf{N}$eural $\textbf{S}$ampler ($\textbf{MDNS}$), a novel framework for training discrete neural samplers by aligning two path measures through a family of learning objectives, theoretically grounded in the stochastic optimal control of continuous-time Markov chains. We validate the efficiency and scalability of MDNS through extensive experiments on various distributions with distinct statistical properties, where MDNS learns to accurately sample from the target distributions despite extremely high problem dimensions and outperforms other learning-based baselines by a large margin. A comprehensive study of ablations and extensions is also provided to demonstrate the efficacy and potential of the proposed framework.
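For concreteness, here is a minimal Python sketch of the sampling problem the abstract describes, not of the MDNS method itself: an illustrative Ising-style energy $U$ over binary spins defines the unnormalized target $\pi\propto\mathrm{e}^{-U}$, and brute-force enumeration recovers the exact distribution as a reference. This is feasible only because the dimension is tiny; a learned sampler such as MDNS targets the regime where enumerating all $2^d$ states is intractable. The energy function and all names below are illustrative assumptions, not the paper's benchmarks.

```python
import itertools
import numpy as np

# Illustrative toy problem (an assumption, not from the paper): a 1D
# Ising-style energy over d binary spins s_i in {-1, +1}. The target
# pmf is pi(s) ∝ exp(-U(s)); the normalizing constant Z is unknown
# to the sampler in general.
def energy(s: np.ndarray, coupling: float = 1.0) -> float:
    # Nearest-neighbor interaction; lower energy when neighbors agree.
    return -coupling * float(np.sum(s[:-1] * s[1:]))

d = 10  # tiny dimension so exact enumeration is feasible here

# Enumerate all 2^d states to compute pi exactly. This loop is the
# part that becomes impossible at the problem sizes a neural sampler
# is meant to handle.
states = np.array(list(itertools.product([-1, 1], repeat=d)))
log_weights = np.array([-energy(s) for s in states])
log_Z = np.logaddexp.reduce(log_weights)  # log normalizing constant
pi = np.exp(log_weights - log_Z)          # exact target pmf

# Draw exact reference samples; a trained sampler's empirical
# distribution should match pi without ever computing Z.
rng = np.random.default_rng(0)
idx = rng.choice(len(states), size=5, p=pi)
print(states[idx])
```

A trained neural sampler would be evaluated by how closely its empirical sample distribution matches `pi` while only ever querying the unnormalized energy $U$, never the normalizing constant.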

Country of Origin
🇺🇸 United States

Page Count
42 pages

Category
Computer Science:
Machine Learning (CS)