One-Step Diffusion Samplers via Self-Distillation and Deterministic Flow
By: Pascal Jutras-Dube, Jiaru Zhang, Ziran Wang, and more
Sampling from unnormalized target distributions is a fundamental yet challenging task in machine learning and statistics. Existing sampling algorithms typically require many iterative steps to produce high-quality samples, leading to high computational costs. We introduce one-step diffusion samplers, which learn a step-conditioned ODE so that one large step reproduces the trajectory of many small ones via a state-space consistency loss. We further show that standard ELBO estimates in diffusion samplers degrade in the few-step regime because common discrete integrators yield mismatched forward/backward transition kernels. Motivated by this analysis, we derive a deterministic-flow (DF) importance weight for ELBO estimation that requires no backward kernel. To calibrate DF, we introduce a volume-consistency regularization that aligns the accumulated volume change along the flow across step resolutions. Our proposed sampler therefore achieves both sampling and stable evidence estimation in only one or a few steps. Across challenging synthetic and Bayesian benchmarks, it achieves competitive sample quality with orders-of-magnitude fewer network evaluations while maintaining robust ELBO estimates.
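The state-space consistency idea, that one large integrator step should reproduce the composition of several small ones, can be sketched as below. This is a minimal toy illustration, not the paper's method: the linear step-conditioned drift and the names `drift`, `euler_step`, and `consistency_loss` are assumptions for exposition, and a real sampler would replace the drift with a learned network and minimize the loss over sampled states and step sizes.

```python
import numpy as np

def drift(x, t, dt, theta):
    # Toy step-conditioned drift (hypothetical); in the paper this role is
    # played by a neural network conditioned on the step size.
    return -theta * x * (1.0 + 0.1 * dt)

def euler_step(x, t, dt, theta):
    # One explicit Euler step of the ODE dx/dt = drift(x, t, dt).
    return x + dt * drift(x, t, dt, theta)

def consistency_loss(x, t, dt, theta):
    # Student: one large step of size dt.
    big = euler_step(x, t, dt, theta)
    # Teacher: two small steps of size dt/2 along the same trajectory.
    half = euler_step(x, t, dt / 2, theta)
    small = euler_step(half, t + dt / 2, dt / 2, theta)
    # Penalize the mismatch between the coarse and fine endpoints.
    return float(np.mean((big - small) ** 2))

x0 = np.linspace(-1.0, 1.0, 5)
loss = consistency_loss(x0, 0.0, 0.5, 1.0)
```

Driving this loss to zero across step resolutions is what lets a single large step stand in for a long chain of small ones at sampling time.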
Similar Papers
Joint Distillation for Fast Likelihood Evaluation and Sampling in Flow-based Models
Machine Learning (CS)
Makes AI create better pictures faster.
Diffusion Models are Molecular Dynamics Simulators
Machine Learning (CS)
Simulates molecules moving like real life.