Convergence of Deterministic and Stochastic Diffusion-Model Samplers: A Simple Analysis in Wasserstein Distance
By: Eliot Beyler, Francis Bach
Potential Business Impact:
Makes AI create better pictures by proving the sampling math works.
We provide new convergence guarantees in Wasserstein distance for diffusion-based generative models, covering both stochastic (DDPM-like) and deterministic (DDIM-like) sampling methods. We introduce a simple framework to analyze discretization, initialization, and score estimation errors. Notably, we derive the first Wasserstein convergence bound for the Heun sampler and improve existing results for the Euler sampler of the probability flow ODE. Our analysis emphasizes the importance of spatial regularity of the learned score function and argues for controlling the score error with respect to the true reverse process, in line with denoising score matching. We also incorporate recent results on smoothed Wasserstein distances to sharpen initialization error bounds.
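To make the samplers in the abstract concrete, here is a minimal, self-contained sketch (not from the paper) of the three discretizations it refers to: a DDPM-like Euler-Maruyama step on the reverse SDE, a DDIM-like Euler step on the probability flow ODE, and a second-order Heun step on the same ODE. The linear noise schedule, the toy Gaussian data, and the closed-form score standing in for a learned network s_theta(x, t) are all illustrative assumptions.

```python
import numpy as np

SIGMA0 = 0.5  # std of the toy Gaussian data distribution (illustrative)

def beta(t):
    # Linear VP noise schedule (an illustrative choice)
    return 0.1 + 19.9 * t

def alpha(t):
    # alpha_t = exp(-0.5 * int_0^t beta(s) ds) for the schedule above
    return np.exp(-0.5 * (0.1 * t + 9.95 * t ** 2))

def score(x, t):
    # Exact score of p_t when the data is N(0, SIGMA0^2 I); a real
    # sampler would call a trained network s_theta(x, t) here instead.
    v = alpha(t) ** 2 * SIGMA0 ** 2 + 1.0 - alpha(t) ** 2  # marginal variance
    return -x / v

def pf_ode_drift(x, t):
    # Probability flow ODE: dx/dt = -0.5*beta(t)*x - 0.5*beta(t)*score(x, t)
    return -0.5 * beta(t) * (x + score(x, t))

def euler_sampler(x, ts):
    # DDIM-like deterministic sampler: explicit Euler on the PF ODE,
    # integrating backward from t = 1 (noise) toward t = 0 (data).
    for t0, t1 in zip(ts[:-1], ts[1:]):
        x = x + (t1 - t0) * pf_ode_drift(x, t0)
    return x

def heun_sampler(x, ts):
    # Heun sampler: Euler predictor followed by a trapezoidal corrector,
    # i.e. a second-order method on the same PF ODE.
    for t0, t1 in zip(ts[:-1], ts[1:]):
        h = t1 - t0
        d0 = pf_ode_drift(x, t0)
        x_pred = x + h * d0            # Euler predictor
        d1 = pf_ode_drift(x_pred, t1)  # slope at the predicted point
        x = x + 0.5 * h * (d0 + d1)    # average of the two slopes
    return x

def ddpm_sampler(x, ts, rng):
    # DDPM-like stochastic sampler: Euler-Maruyama on the reverse SDE
    # dx = [-0.5*beta(t)*x - beta(t)*score(x, t)] dt + sqrt(beta(t)) dw.
    for t0, t1 in zip(ts[:-1], ts[1:]):
        h = t1 - t0  # negative: we integrate backward in time
        drift = -0.5 * beta(t0) * x - beta(t0) * score(x, t0)
        x = x + h * drift + np.sqrt(beta(t0) * abs(h)) * rng.standard_normal(x.shape)
    return x

rng = np.random.default_rng(0)
ts = np.linspace(1.0, 1e-3, 101)     # reverse-time discretization grid
x1 = rng.standard_normal((1000, 2))  # initialize from N(0, I) at t = 1
print(heun_sampler(x1, ts).std())    # should land close to SIGMA0
```

Because the toy score here is exact, Heun's extra slope evaluation buys a smaller discretization error per step than Euler, which is the gap the paper's bounds quantify; with a learned score, the spatial regularity of s_theta stressed in the abstract governs whether that advantage survives.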
Similar Papers
Assessing the Quality of Denoising Diffusion Models in Wasserstein Distance: Noisy Score and Optimal Bounds
Machine Learning (Stat)
Makes AI create better pictures from messy data.
Optimal Convergence Analysis of DDPM for General Distributions
Machine Learning (Stat)
Makes AI create better pictures faster.
Wasserstein Convergence of Critically Damped Langevin Diffusions
Statistics Theory
Makes AI create better pictures by adding noise.