Error Analysis of Discrete Flow with Generator Matching
By: Zhengyan Wan, Yidong Ouyang, Qiang Yao, and others
Potential Business Impact:
Teaches computers to learn patterns better.
Discrete flow models offer a powerful framework for learning distributions over discrete state spaces and have demonstrated superior performance compared to discrete diffusion models. However, their convergence properties and error analysis remain largely unexplored. In this work, we develop a unified framework grounded in stochastic calculus to systematically investigate the theoretical properties of discrete flow. Specifically, we derive the KL divergence between the path measures of two continuous-time Markov chains (CTMCs) with different transition rates by developing a novel Girsanov-type theorem, and we provide a comprehensive analysis covering the errors arising from transition rate estimation and early stopping, the former of which has rarely been analyzed in existing work. Unlike discrete diffusion models, discrete flow incurs no truncation error from truncating the time horizon of the noising process. Building on generator matching and uniformization, we establish non-asymptotic error bounds for distribution estimation. Our results provide the first error analysis for discrete flow models.
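Uniformization, which the error bounds above build on, reduces simulating a CTMC to drawing a Poisson number of candidate jumps and applying a discrete-time transition kernel. A minimal sketch of the standard technique, assuming a time-homogeneous rate matrix; the function name and setup are illustrative, not code from the paper:

```python
import numpy as np

def sample_ctmc_uniformization(rate_matrix, x0, horizon, rng):
    """Sample the state of a CTMC at time `horizon` via uniformization.

    rate_matrix: (S, S) generator Q; off-diagonal entries are jump rates,
    each row sums to zero. Illustrative sketch, not the paper's code.
    """
    num_states = rate_matrix.shape[0]
    # Dominating rate: the largest total exit rate over all states.
    lam = np.max(-np.diag(rate_matrix))
    # Discrete-time kernel P = I + Q / lam; rows are probability vectors,
    # with self-loops absorbing the slack for states that exit slowly.
    kernel = np.eye(num_states) + rate_matrix / lam
    # Number of candidate jump times on [0, horizon] is Poisson(lam * horizon).
    n_jumps = rng.poisson(lam * horizon)
    x = x0
    for _ in range(n_jumps):
        x = rng.choice(num_states, p=kernel[x])
    return x
```

For a symmetric two-state chain (unit rates in both directions), the endpoint distribution produced this way converges to the uniform stationary distribution as the horizon grows, which gives a quick sanity check of the sampler.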
Similar Papers
Foundations of Diffusion Models in General State Spaces: A Self-Contained Introduction
Machine Learning (Stat)
Unifies how computers learn from pictures and words.
Non-Asymptotic Convergence of Discrete Diffusion Models: Masked and Random Walk dynamics
Machine Learning (CS)
Makes computers create better pictures from scratch.
Branching Flows: Discrete, Continuous, and Manifold Flow Matching with Splits and Deletions
Machine Learning (Stat)
Lets computers create things of changing sizes.