Wasserstein Convergence of Critically Damped Langevin Diffusions
By: Stanislas Strasman, Sobihan Surendran, Claire Boyer, and more
Potential Business Impact:
Makes AI create better pictures by tuning how noise is added.
Score-based Generative Models (SGMs) have achieved impressive performance in data generation across a wide range of applications and benefit from strong theoretical guarantees. Recently, methods inspired by statistical mechanics, in particular Hamiltonian dynamics, have introduced Critically-damped Langevin Diffusions (CLDs), which define diffusion processes on extended spaces by coupling the data with auxiliary variables. These approaches, along with their associated score-matching and sampling procedures, have been shown numerically to outperform standard diffusion-based samplers. In this paper, we analyze generalized dynamics that extend classical CLDs by introducing an additional hyperparameter controlling the noise applied to the data coordinate, thereby better exploiting the extended space. We further derive a novel upper bound on the sampling error of CLD-based generative models in the Wasserstein metric. This additional hyperparameter influences the smoothness of sample paths, and our discretization error analysis provides practical guidance for its tuning, leading to improved sampling performance.
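To make the extended-space construction concrete, below is a minimal sketch of an Euler-Maruyama simulation of a CLD-style forward SDE on the coupled pair (x, v), with an extra hyperparameter (called nu here, a hypothetical name) injecting noise directly on the data coordinate. The drift and diffusion coefficients follow the classical CLD of Dockhorn et al. (2022); the exact placement and scaling of the nu term are assumptions for illustration and may differ from the authors' parametrization.

```python
import numpy as np

def cld_forward(x0, beta=4.0, gamma=2.0, M=1.0, nu=0.1,
                T=1.0, n_steps=1000, rng=None):
    """Euler-Maruyama simulation of a (generalized) critically damped
    Langevin forward SDE on the extended space (x, v).

    Classical CLD drift/diffusion (critical damping: gamma**2 == 4*M):
        dx_t = beta * M^{-1} v_t dt
        dv_t = -beta * x_t dt - beta * gamma * M^{-1} v_t dt
               + sqrt(2 * beta * gamma) dW_t

    `nu` is a hypothetical stand-in for the paper's extra hyperparameter:
    here it adds sqrt(2 * beta * nu) dB_t to the data coordinate, so
    nu = 0 recovers the classical CLD, whose x-paths are smoother
    because noise enters only through the velocity v.
    """
    rng = np.random.default_rng() if rng is None else rng
    dt = T / n_steps
    x = np.asarray(x0, dtype=float)
    # Initialize the velocity from its Gaussian stationary marginal.
    v = rng.normal(scale=np.sqrt(M), size=x.shape)
    for _ in range(n_steps):
        dWx = rng.normal(scale=np.sqrt(dt), size=x.shape)
        dWv = rng.normal(scale=np.sqrt(dt), size=x.shape)
        x_new = x + beta * (v / M) * dt + np.sqrt(2.0 * beta * nu) * dWx
        v_new = (v - beta * x * dt - beta * gamma * (v / M) * dt
                 + np.sqrt(2.0 * beta * gamma) * dWv)
        x, v = x_new, v_new
    return x, v

# Example: diffuse a 2-D data point; try nu=0.0 vs nu=0.1 to see how
# the extra data-coordinate noise changes the roughness of x-paths.
x_T, v_T = cld_forward(np.zeros(2), nu=0.1)
```

Under these assumptions, nu trades off path smoothness against how directly the data coordinate is noised, which is the quantity the paper's discretization error analysis suggests tuning.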
Similar Papers
Score-based constrained generative modeling via Langevin diffusions with boundary conditions
Machine Learning (Stat)
Makes AI create images that follow rules.
Wasserstein Convergence of Score-based Generative Models under Semiconvexity and Discontinuous Gradients
Machine Learning (CS)
Makes AI create realistic images from messy data.
Convergence of Deterministic and Stochastic Diffusion-Model Samplers: A Simple Analysis in Wasserstein Distance
Machine Learning (CS)
Makes AI create better pictures by fixing math.