Score-Based Deterministic Density Sampling
By: Vasily Ilin, Peter Sushko, Jingwei Hu
Potential Business Impact:
Makes computers create realistic pictures faster.
We propose a deterministic sampling framework using Score-Based Transport Modeling for sampling an unnormalized target density $\pi$ given only its score $\nabla \log \pi$. Our method approximates the Wasserstein gradient flow on $\mathrm{KL}(f_t\|\pi)$ by learning the time-varying score $\nabla \log f_t$ on the fly using score matching. While having the same marginal distribution as Langevin dynamics, our method produces smooth, deterministic trajectories, resulting in monotone, noise-free convergence. We prove that our method dissipates relative entropy at the same rate as the exact gradient flow, provided sufficient training. Numerical experiments validate our theoretical findings: our method converges at the optimal rate, has smooth trajectories, and is usually more sample-efficient than its stochastic counterpart. Experiments on high-dimensional image data show that our method produces high-quality generations in as few as 15 steps and exhibits natural exploratory behavior. Memory and runtime scale linearly in the sample size.
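To make the idea concrete, here is a minimal sketch of the kind of sampling loop the abstract describes: particles are moved deterministically along the KL gradient-flow velocity $\nabla \log \pi - \nabla \log f_t$, with $\nabla \log f_t$ re-fit on the current particles at every step. This is an illustrative reconstruction, not the authors' released code; the closed-form `grad_log_pi` target, the network architecture, and the use of denoising score matching (one of several possible score-matching losses) are all assumptions made for the example.

```python
# Illustrative sketch of score-based deterministic density sampling.
# Assumptions: grad_log_pi is known in closed form (here a standard
# Gaussian target), and the particle density's score grad log f_t is
# approximated on the fly with denoising score matching.
import torch
import torch.nn as nn

def grad_log_pi(x):
    # Example target: standard Gaussian, so grad log pi(x) = -x.
    return -x

class ScoreNet(nn.Module):
    def __init__(self, dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x):
        return self.net(x)

def fit_score(model, x, sigma=0.05, iters=50, lr=1e-3):
    # Denoising score matching on the current particles: the minimizer
    # approximates grad log f_t up to O(sigma^2) smoothing error.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(iters):
        noise = torch.randn_like(x)
        x_noisy = x + sigma * noise
        target = -noise / sigma  # = grad_{x_noisy} log q_sigma(x_noisy | x)
        loss = ((model(x_noisy) - target) ** 2).sum(dim=1).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

def sample(dim=2, n_particles=512, n_steps=15, dt=0.1):
    # Particles drawn from an initial reference density f_0.
    x = torch.randn(n_particles, dim) * 2.0 + 3.0
    model = ScoreNet(dim)
    for _ in range(n_steps):
        fit_score(model, x.detach())
        with torch.no_grad():
            # Deterministic transport along the KL gradient-flow velocity
            # v_t = grad log pi - grad log f_t (no injected noise).
            x = x + dt * (grad_log_pi(x) - model(x))
    return x

if __name__ == "__main__":
    samples = sample()
    print(samples.mean(0), samples.std(0))  # should approach (0, 1) per dim
```

Compared with Langevin dynamics, which would add a $\sqrt{2\,dt}$ Gaussian kick at each step, the update above is noise-free, which is what gives the smooth, monotone trajectories the abstract refers to.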
Similar Papers
The Effect of Stochasticity in Score-Based Diffusion Sampling: a KL Divergence Analysis
Machine Learning (CS)
Makes AI art generation more accurate and controllable.
Malliavin Calculus for Score-based Diffusion Models
Machine Learning (CS)
Makes AI create realistic images and sounds.
Optimal estimation of a factorizable density using diffusion models with ReLU neural networks
Statistics Theory
Helps AI learn patterns from less data.