Generalization bounds for score-based generative models: a synthetic proof
By: Arthur Stéphanovitch, Eddie Aamari, Clément Levrard
Potential Business Impact:
Gives mathematical guarantees on how well AI image generators trained on limited data reproduce the true data distribution.
We establish minimax convergence rates for score-based generative models (SGMs) under the $1$-Wasserstein distance. Assuming the target density $p^\star$ lies in a nonparametric $\beta$-smooth H\"older class with either compact support or sub-Gaussian tails on $\mathbb{R}^d$, we prove that neural network-based score estimators trained via denoising score matching yield generative models achieving the rate $n^{-(\beta+1)/(2\beta+d)}$ up to polylogarithmic factors. Our unified analysis handles arbitrary smoothness $\beta > 0$, supports both deterministic and stochastic samplers, and leverages shape constraints on $p^\star$ to induce regularity of the score. The resulting proofs are more concise and are grounded in generic stability of diffusions and standard approximation theory.
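For context, denoising score matching fits a score network $s_\theta$ by regressing onto the conditional score of the Gaussian noising kernel. A minimal sketch of the standard objective is given below; the weight $w(t)$, the noise schedule $(\alpha_t, \sigma_t)$, and the horizon $T$ are illustrative notation and need not match the paper's exact setup:

$$
\mathcal{L}_{\mathrm{DSM}}(\theta) \;=\; \int_0^T w(t)\,\mathbb{E}\Bigl[\bigl\| s_\theta(X_t, t) + \tfrac{X_t - \alpha_t X_0}{\sigma_t^2} \bigr\|^2\Bigr]\, dt,
\qquad X_t = \alpha_t X_0 + \sigma_t Z,\quad Z \sim \mathcal{N}(0, I_d),\quad X_0 \sim p^\star.
$$

Here $-(X_t - \alpha_t X_0)/\sigma_t^2$ is the conditional score of $X_t$ given $X_0$, so minimizing $\mathcal{L}_{\mathrm{DSM}}$ over $n$ samples yields an estimate of $\nabla \log p_t$. Running a reverse-time sampler (deterministic or stochastic) with the fitted score then produces a generative distribution whose $W_1$ distance to $p^\star$ scales as $n^{-(\beta+1)/(2\beta+d)}$; for instance, $\beta = 2$ and $d = 3$ give the rate $n^{-3/7}$.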
Similar Papers
Approximation and Generalization Abilities of Score-based Neural Network Generative Models for Sub-Gaussian Distributions
Machine Learning (CS)
Studies how well neural-network score-based generators approximate and generalize sub-Gaussian data distributions.
Wasserstein Convergence of Score-based Generative Models under Semiconvexity and Discontinuous Gradients
Machine Learning (CS)
Proves Wasserstein convergence of score-based generators under weak regularity assumptions (semiconvexity, discontinuous gradients).
Algorithm- and Data-Dependent Generalization Bounds for Score-Based Generative Models
Machine Learning (Stat)
Derives generalization bounds for score-based generators that account for the training algorithm and the data.