On Flow Matching KL Divergence

Published: November 7, 2025 | arXiv ID: 2511.05480v1

By: Maojiang Su, Jerry Yao-Chieh Hu, Sophia Pi, and more

Potential Business Impact:

Provides theoretical guarantees on how accurately and efficiently flow-matching generative models learn data distributions.

Business Areas:
A/B Testing, Data and Analytics

We derive a deterministic, non-asymptotic upper bound on the Kullback-Leibler (KL) divergence of the flow-matching distribution approximation. In particular, if the $L_2$ flow-matching loss is bounded by $\epsilon^2 > 0$, then the KL divergence between the true data distribution and the estimated distribution is bounded by $A_1 \epsilon + A_2 \epsilon^2$. Here, the constants $A_1$ and $A_2$ depend only on the regularity of the data and velocity fields. Consequently, this bound implies statistical convergence rates for Flow Matching Transformers under the Total Variation (TV) distance. We show that flow matching achieves nearly minimax-optimal efficiency in estimating smooth distributions. Our results make the statistical efficiency of flow matching comparable to that of diffusion models under the TV distance. Numerical studies on synthetic and learned velocities corroborate our theory.
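
The abstract's statement is quantitative: if the $L_2$ flow-matching loss is at most $\epsilon^2$, then $\mathrm{KL}(p_{\mathrm{data}} \,\|\, \hat{p}) \le A_1 \epsilon + A_2 \epsilon^2$. As a minimal illustrative sketch (not the paper's code), the snippet below estimates such an $L_2$ loss by Monte Carlo for a toy linear-interpolation probability path with a hypothetical stand-in velocity model; the function `model_velocity`, the path choice, and the toy source/target distributions are assumptions, and the constants $A_1$, $A_2$ are left abstract as in the abstract.

```python
# Minimal sketch: Monte Carlo estimate of the L2 flow-matching loss whose
# bound eps^2 enters the paper's KL guarantee. Assumptions: a linear
# interpolation path x_t = (1 - t) * x0 + t * x1 (conditional target
# velocity x1 - x0) and a toy closed-form "model" velocity standing in
# for a trained network v_theta; the paper treats general regular fields.
import numpy as np

rng = np.random.default_rng(0)

def model_velocity(x_t, t):
    # Hypothetical stand-in for a learned velocity network v_theta(x_t, t).
    return -x_t / (1.0 - t + 1e-3)

def flow_matching_loss(n_samples=100_000, dim=2):
    x0 = rng.standard_normal((n_samples, dim))          # source (noise) samples
    x1 = rng.standard_normal((n_samples, dim)) + 3.0    # toy "data" samples
    t = rng.uniform(0.0, 1.0, size=(n_samples, 1))      # random times in [0, 1]
    x_t = (1.0 - t) * x0 + t * x1                        # linear probability path
    target_v = x1 - x0                                   # conditional target velocity
    pred_v = model_velocity(x_t, t)
    return np.mean(np.sum((pred_v - target_v) ** 2, axis=1))

eps_sq = flow_matching_loss()
eps = np.sqrt(eps_sq)
# With constants A1, A2 depending on the regularity of the data and
# velocity fields, the paper's bound reads:
#   KL(p_data || p_hat) <= A1 * eps + A2 * eps**2
print(f"estimated L2 flow-matching loss eps^2 = {eps_sq:.3f}, eps = {eps:.3f}")
```

With a toy velocity like this the loss is large; the point is only to show which quantity plays the role of $\epsilon^2$. By Pinsker's inequality, a KL bound of this form also controls the TV distance, which is the metric used for the convergence rates mentioned above.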

Country of Origin
🇺🇸 United States

Page Count
30 pages

Category
Computer Science: Machine Learning (CS)