Discrete State Diffusion Models: A Sample Complexity Perspective
By: Aadithya Srikanth, Mudit Gaur, Vaneet Aggarwal
Potential Business Impact:
Shows how much training data AI needs to reliably generate text, sequences, and other discrete data.
Diffusion models have demonstrated remarkable performance in generating high-dimensional samples across domains such as vision, language, and the sciences. While continuous-state diffusion models have been studied extensively, both empirically and theoretically, discrete-state diffusion models, which are essential for applications involving text, sequences, and combinatorial structures, remain far less understood theoretically. In particular, all existing analyses of discrete-state models assume a bound on the score estimation error rather than deriving one, leaving their sample complexity unaddressed. In this work, we present a principled theoretical framework for discrete-state diffusion, providing the first sample complexity bound of $\widetilde{\mathcal{O}}(\epsilon^{-2})$. Our structured decomposition of the score estimation error into statistical, approximation, optimization, and clipping components offers critical insight into how discrete-state models can be trained efficiently. This analysis closes a fundamental gap in the literature and establishes both the theoretical tractability and the practical relevance of discrete-state diffusion models.
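As a rough sketch of the decomposition mentioned above (the notation here is illustrative, not necessarily the paper's), the total score estimation error is bounded by the sum of four terms:

$$\mathcal{E}_{\mathrm{score}} \;\lesssim\; \mathcal{E}_{\mathrm{stat}} + \mathcal{E}_{\mathrm{approx}} + \mathcal{E}_{\mathrm{opt}} + \mathcal{E}_{\mathrm{clip}},$$

where $\mathcal{E}_{\mathrm{stat}}$ reflects having only finitely many training samples, $\mathcal{E}_{\mathrm{approx}}$ the capacity of the score network, $\mathcal{E}_{\mathrm{opt}}$ the accuracy of the optimizer, and $\mathcal{E}_{\mathrm{clip}}$ the effect of truncating the estimated score. If the statistical term decays at the usual parametric rate $\mathcal{O}(n^{-1/2})$ in the number of training samples $n$, then driving the total error below $\epsilon$ requires $n = \widetilde{\mathcal{O}}(\epsilon^{-2})$ samples, which is consistent with the stated bound.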
Similar Papers
Foundations of Diffusion Models in General State Spaces: A Self-Contained Introduction
Machine Learning (Stat)
Explains diffusion models for both pictures and words in one framework.
Non-Asymptotic Convergence of Discrete Diffusion Models: Masked and Random Walk dynamics
Machine Learning (CS)
Shows how quickly discrete diffusion models converge to accurate samples.
Disentanglement in T-space for Faster and Distributed Training of Diffusion Models with Fewer Latent-states
Machine Learning (CS)
Trains image-generating AI faster and across many machines.