Pathway to $O(\sqrt{d})$ complexity bound under Wasserstein metric of flow-based models
By: Xiangjun Meng, Zhongjian Wang
Potential Business Impact:
Shows flow-based AI models can generate accurate samples in fewer steps, even in high dimensions.
We provide tractable analytical tools to estimate the error of flow-based generative models under the Wasserstein metric and to establish an optimal sampling iteration complexity bound of $O(\sqrt{d})$ with respect to the dimension $d$. We show that this error can be explicitly controlled by two parts: the Lipschitzness of the push-forward maps of the backward flow, which scales independently of the dimension, and a local discretization error, which scales as $O(\sqrt{d})$ in the dimension. The former is related to the existence of Lipschitz changes of variables induced by the (heat) flow. The latter depends on the regularity of the score function in both the spatial and temporal directions. These assumptions hold for the flow-based generative models associated with the Föllmer process and the $1$-rectified flow under a Gaussian tail assumption. As a consequence, we show that the sampling iteration complexity grows linearly with the square root of the trace of the covariance operator associated with the invariant distribution of the forward process.
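To make the scaling concrete, here is a heuristic sketch (our illustration, not the paper's proof) of how such a two-part decomposition yields the $O(\sqrt{d})$ iteration complexity. Assume the backward push-forward maps are $L$-Lipschitz with $L$ independent of $d$, the sampler runs $N$ discretization steps of uniform size $h = T/N$ over a horizon $T$, and the score regularity makes each step incur a local error $\epsilon_k = O(\sqrt{d}\,h^2)$; all of these symbols are illustrative assumptions:

$$
W_2\bigl(\mathrm{Law}(\hat X_N),\, p_{\mathrm{data}}\bigr) \;\lesssim\; L \sum_{k=1}^{N} \epsilon_k \;=\; O\!\bigl(L\,\sqrt{d}\,T\,h\bigr) \;=\; O\!\Bigl(\tfrac{\sqrt{d}}{N}\Bigr),
$$

since $Nh = T$ and $L$, $T$ are treated as dimension-free constants. Reaching Wasserstein accuracy $\varepsilon$ then requires $N = O(\sqrt{d}/\varepsilon)$ iterations; per the abstract, $\sqrt{d}$ can further be refined to $\sqrt{\operatorname{tr}(\Sigma)}$, where $\Sigma$ is the covariance operator of the invariant distribution of the forward process.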
Similar Papers
Generative Modeling with Continuous Flows: Sample Complexity of Flow Matching
Machine Learning (CS)
Makes AI create better pictures with less data.
Dimension-free error estimate for diffusion model and optimal scheduling
Machine Learning (Stat)
Makes AI create better fake pictures and sounds.
Distribution estimation via Flow Matching with Lipschitz guarantees
Machine Learning (Stat)
Makes AI learn faster and better.