Minimax Optimality of the Probability Flow ODE for Diffusion Models
By: Changxiao Cai, Gen Li
Potential Business Impact:
Proves that the fast, deterministic samplers behind modern AI image generators can be as statistically accurate as any method possibly could be.
Score-based diffusion models have become a foundational paradigm for modern generative modeling, demonstrating exceptional capability in generating samples from complex high-dimensional distributions. Despite the dominant adoption of probability flow ODE-based samplers in practice due to their superior sampling efficiency and precision, rigorous statistical guarantees for these methods have remained elusive in the literature. This work develops the first end-to-end theoretical framework for deterministic ODE-based samplers that establishes near-minimax optimal guarantees under mild assumptions on target data distributions. Specifically, focusing on subgaussian distributions with $\beta$-H\"older smooth densities for $\beta\leq 2$, we propose a smooth regularized score estimator that simultaneously controls both the $L^2$ score error and the associated mean Jacobian error. Leveraging this estimator within a refined convergence analysis of the ODE-based sampling process, we demonstrate that the resulting sampler achieves the minimax rate in total variation distance, modulo logarithmic factors. Notably, our theory comprehensively accounts for all sources of error in the sampling process and does not require strong structural conditions such as density lower bounds or Lipschitz/smooth scores on target distributions, thereby covering a broad range of practical data distributions.
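For context, the probability flow ODE associated with a variance-preserving diffusion transports a Gaussian prior sample backward through the deterministic dynamics $\frac{\mathrm{d}X_t}{\mathrm{d}t} = -\frac{1}{2}\beta(t)\left(X_t + \nabla \log p_t(X_t)\right)$, with the true score $\nabla \log p_t$ replaced by a learned estimate. For $\beta$-H\"older densities in $d$ dimensions, the classical minimax rate in total variation is of order $n^{-\beta/(d+2\beta)}$, and this is the benchmark the paper's sampler attains up to logarithmic factors. Below is a minimal sketch of such a deterministic sampler, assuming a pretrained score estimate `score_fn` and a linear noise schedule; the function name, the schedule parameters `beta_min`/`beta_max`, and the plain Euler discretization are illustrative choices, not the regularized estimator or refined convergence analysis developed in the paper.

```python
import numpy as np

def probability_flow_ode_sample(score_fn, dim, n_steps=1000, T=1.0,
                                beta_min=0.1, beta_max=20.0, seed=0):
    """Euler integration of the probability flow ODE for a VP diffusion.

    score_fn(x, t) should return an estimate of grad_x log p_t(x).
    Integrates dx/dt = -0.5 * beta(t) * (x + score_fn(x, t))
    backward in time from t = T to t ~ 0.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(dim)          # draw from the N(0, I) prior
    ts = np.linspace(T, 1e-3, n_steps)    # reverse-time grid; stop short of t = 0
    for i in range(n_steps - 1):
        t = ts[i]
        dt = ts[i + 1] - ts[i]            # negative: we integrate backward
        beta_t = beta_min + t * (beta_max - beta_min)   # linear schedule (illustrative)
        drift = -0.5 * beta_t * (x + score_fn(x, t))
        x = x + dt * drift                # deterministic Euler step
    return x

# Toy check: for a standard-normal target the true score is -x at every t,
# so the drift vanishes and the sampler returns its N(0, I) prior draw.
sample = probability_flow_ode_sample(lambda x, t: -x, dim=2)
```

Because the update is deterministic, the only sources of error are the score estimate and the time discretization; the paper's contribution is to control both simultaneously, including the mean Jacobian error of the score that the ODE analysis requires.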
Similar Papers
Fast Convergence for High-Order ODE Solvers in Diffusion Probabilistic Models
Machine Learning (CS)
Establishes fast convergence guarantees for high-order ODE solvers, letting diffusion models generate high-quality samples in fewer steps.
Non-asymptotic error bounds for probability flow ODEs under weak log-concavity
Machine Learning (Stat)
Derives non-asymptotic error bounds for probability flow ODE samplers when the data distribution is only weakly log-concave.
Geometric Regularity in Deterministic Sampling of Diffusion-based Generative Models
Machine Learning (CS)
Identifies geometric regularity in the deterministic sampling trajectories of diffusion models, with implications for faster, higher-quality generation.