Assessing the Quality of Denoising Diffusion Models in Wasserstein Distance: Noisy Score and Optimal Bounds

Published: June 11, 2025 | arXiv ID: 2506.09681v1

By: Vahan Arsenyan, Elen Vardanyan, Arnak Dalalyan

Potential Business Impact:

Shows that image-generating AI (diffusion models) keeps producing good samples even when its internal denoising signal is estimated imperfectly, and quantifies how quickly sample quality improves with more data.

Business Areas:
A/B Testing; Data and Analytics

Generative modeling aims to produce new random examples from an unknown target distribution, given access to a finite collection of examples. Among the leading approaches, denoising diffusion probabilistic models (DDPMs) construct such examples by mapping a Brownian motion via a diffusion process driven by an estimated score function. In this work, we first provide empirical evidence that DDPMs are robust to constant-variance noise in the score evaluations. We then establish finite-sample guarantees in Wasserstein-2 distance that exhibit two key features: (i) they characterize and quantify the robustness of DDPMs to noisy score estimates, and (ii) they achieve faster convergence rates than previously known results. Furthermore, we observe that the obtained rates match those known in the Gaussian case, implying their optimality.
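To make the setting concrete, below is a minimal sketch (not the paper's code) of DDPM-style reverse sampling in which every score evaluation is corrupted by constant-variance Gaussian noise, the kind of score error whose effect on Wasserstein-2 accuracy the paper quantifies. The target is a toy one-dimensional standard Gaussian so the exact score is known in closed form; the noise schedule, the parameter `score_noise_std`, and all function names are illustrative assumptions, not quantities from the paper.

```python
import numpy as np

def true_score(x):
    # For a N(0, 1) target under a variance-preserving forward process,
    # the noised marginal stays N(0, 1), so the exact score is simply -x.
    return -x

def sample_ddpm(n_samples=10_000, n_steps=200, score_noise_std=0.1, seed=0):
    """Reverse (ancestral) DDPM sampling with noisy score evaluations."""
    rng = np.random.default_rng(seed)
    betas = np.linspace(1e-4, 0.02, n_steps)   # illustrative forward noise schedule
    alphas = 1.0 - betas

    x = rng.standard_normal(n_samples)          # start from pure Gaussian noise
    for t in reversed(range(n_steps)):
        # Noisy score: exact score plus constant-variance Gaussian perturbation.
        score = true_score(x) + score_noise_std * rng.standard_normal(n_samples)
        # Score-parametrized DDPM reverse step: x_{t-1} = (x_t + beta_t * s) / sqrt(alpha_t) + noise.
        x = (x + betas[t] * score) / np.sqrt(alphas[t])
        if t > 0:
            x = x + np.sqrt(betas[t]) * rng.standard_normal(n_samples)
    return x

samples = sample_ddpm()
print("sample mean:", samples.mean(), "sample std:", samples.std())
```

Even with a nonzero `score_noise_std`, the empirical mean and standard deviation of the output stay close to the target's 0 and 1, which is the robustness phenomenon the paper observes empirically and then bounds in Wasserstein-2 distance.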

Repos / Data Links

Page Count
40 pages

Category
Statistics: Machine Learning (stat.ML)