Assessing the Quality of Denoising Diffusion Models in Wasserstein Distance: Noisy Score and Optimal Bounds
By: Vahan Arsenyan, Elen Vardanyan, Arnak Dalalyan
Potential Business Impact:
Makes AI create better pictures from messy data.
Generative modeling aims to produce new random examples from an unknown target distribution, given access to a finite collection of examples. Among the leading approaches, denoising diffusion probabilistic models (DDPMs) construct such examples by mapping a Brownian motion via a diffusion process driven by an estimated score function. In this work, we first provide empirical evidence that DDPMs are robust to constant-variance noise in the score evaluations. We then establish finite-sample guarantees in Wasserstein-2 distance that exhibit two key features: (i) they characterize and quantify the robustness of DDPMs to noisy score estimates, and (ii) they achieve faster convergence rates than previously known results. Furthermore, we observe that the obtained rates match those known in the Gaussian case, implying their optimality.
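The robustness claim in the abstract can be illustrated with a toy experiment: sample a one-dimensional Gaussian target with a DDPM-style reverse SDE, once with the exact score and once with constant-variance Gaussian noise added to every score evaluation. This is a minimal sketch, not the paper's setup: the forward process here is the Ornstein-Uhlenbeck / VP-SDE dX = -X dt + sqrt(2) dW, whose time-t marginal stays Gaussian so the true score is available in closed form, and all names and parameter values (MU, SIGMA, TAU, the step counts) are illustrative choices.

```python
import numpy as np

# Toy check: DDPM-style samplers tolerate constant-variance noise in the
# score.  Target: 1-D Gaussian N(MU, SIGMA^2).  Forward process (assumed):
# OU / VP-SDE  dX = -X dt + sqrt(2) dW, so the time-t marginal is Gaussian
# and the exact score is known in closed form.
MU, SIGMA = 1.0, 0.5       # target N(MU, SIGMA^2)
T, N_STEPS = 5.0, 500      # horizon and Euler-Maruyama steps
N_PART = 20_000            # number of particles
TAU = 0.5                  # std of the constant-variance score noise

rng = np.random.default_rng(0)
dt = T / N_STEPS

def true_score(x, t):
    """Score of the time-t marginal N(MU e^-t, SIGMA^2 e^-2t + 1 - e^-2t)."""
    m = MU * np.exp(-t)
    v = SIGMA**2 * np.exp(-2 * t) + 1.0 - np.exp(-2 * t)
    return -(x - m) / v

def sample(score_noise_std):
    # start from the (approximate) stationary law N(0, 1) at time T
    x = rng.standard_normal(N_PART)
    for k in range(N_STEPS):
        t = T - k * dt                       # reverse physical time
        s = true_score(x, t)
        # noisy score evaluation: exact score + constant-variance noise
        s = s + score_noise_std * rng.standard_normal(N_PART)
        # reverse-time SDE, Euler-Maruyama:  dX = (X + 2 s) dt + sqrt(2) dW
        x = x + (x + 2 * s) * dt + np.sqrt(2 * dt) * rng.standard_normal(N_PART)
    return x

clean = sample(0.0)
noisy = sample(TAU)
print(f"clean score: mean {clean.mean():+.3f}, std {clean.std():.3f}")
print(f"noisy score: mean {noisy.mean():+.3f}, std {noisy.std():.3f}")
```

Both runs should recover a distribution close to N(MU, SIGMA^2): the noise injected into the drift averages out over small steps, consistent with the robustness the paper quantifies in Wasserstein-2 distance.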
Similar Papers
Convergence of Deterministic and Stochastic Diffusion-Model Samplers: A Simple Analysis in Wasserstein Distance
Machine Learning (CS)
Makes AI create better pictures by fixing math.
Dimension-Free Convergence of Diffusion Models for Approximate Gaussian Mixtures
Machine Learning (CS)
Makes AI create realistic pictures faster.
Optimal Convergence Analysis of DDPM for General Distributions
Machine Learning (Stat)
Makes AI create better pictures faster.