Generalization bounds for score-based generative models: a synthetic proof

Published: July 7, 2025 | arXiv ID: 2507.04794v1

By: Arthur Stéphanovitch, Eddie Aamari, Clément Levrard

Potential Business Impact:

Gives theoretical guarantees on how much training data diffusion models (the engines behind AI image and audio generators) need to faithfully learn a data distribution.

Business Areas:
A/B Testing, Data and Analytics

We establish minimax convergence rates for score-based generative models (SGMs) under the $1$-Wasserstein distance. Assuming the target density $p^\star$ lies in a nonparametric $\beta$-smooth Hölder class with either compact support or sub-Gaussian tails on $\mathbb{R}^d$, we prove that neural network-based score estimators trained via denoising score matching yield generative models achieving the rate $n^{-(\beta+1)/(2\beta+d)}$ up to polylogarithmic factors. Our unified analysis handles arbitrary smoothness $\beta > 0$, supports both deterministic and stochastic samplers, and leverages shape constraints on $p^\star$ to induce regularity of the score. The resulting proofs are more concise and are grounded in the generic stability of diffusions and standard approximation theory.
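For readers unfamiliar with denoising score matching, the training objective named in the abstract, here is a minimal illustrative sketch, not code from the paper: a toy PyTorch network is trained to predict the conditional score $-z/\sigma_t$ of an Ornstein-Uhlenbeck forward process $x_t = e^{-t} x_0 + \sigma_t z$ with $\sigma_t = \sqrt{1 - e^{-2t}}$. All names (ScoreNet, dsm_loss) and hyperparameters are hypothetical choices for the example.

```python
import torch
import torch.nn as nn

class ScoreNet(nn.Module):
    """Toy score network mapping (x_t, t) to an estimated score in R^d."""
    def __init__(self, d, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, d),
        )

    def forward(self, x, t):
        return self.net(torch.cat([x, t], dim=-1))

def dsm_loss(score_net, x0, eps=1e-3):
    """Denoising score matching loss for the Ornstein-Uhlenbeck forward
    process x_t = e^{-t} x_0 + sigma_t z. The score of p(x_t | x_0) is
    -z / sigma_t, which the network learns to predict."""
    n, _ = x0.shape
    t = eps + (1 - eps) * torch.rand(n, 1)   # t ~ Uniform(eps, 1)
    alpha = torch.exp(-t)                    # mean decay e^{-t}
    sigma = torch.sqrt(1 - alpha**2)         # conditional noise scale
    z = torch.randn_like(x0)
    xt = alpha * x0 + sigma * z
    target = -z / sigma                      # conditional score
    pred = score_net(xt, t)
    # Weight by sigma^2 so the objective stays bounded as t -> 0.
    return ((sigma**2) * (pred - target) ** 2).sum(dim=-1).mean()

# Usage: n samples from the unknown target density p_star in R^d.
d = 2
x0 = torch.randn(512, d)                     # stand-in for real data
model = ScoreNet(d)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(100):
    opt.zero_grad()
    loss = dsm_loss(model, x0)
    loss.backward()
    opt.step()
```

To get a feel for the rate: for a Lipschitz density ($\beta = 1$) on $\mathbb{R}^2$, the bound reads $n^{-(1+1)/(2+2)} = n^{-1/2}$, so halving the Wasserstein error requires roughly four times as many samples.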

Country of Origin
🇫🇷 France

Page Count
41 pages

Category
Mathematics: Statistics Theory