Geometric Regularity in Deterministic Sampling of Diffusion-based Generative Models
By: Defang Chen, Zhenyu Zhou, Can Wang, and more
Potential Business Impact:
Makes AI create better pictures faster.
Diffusion-based generative models employ stochastic differential equations (SDEs) and their equivalent probability flow ordinary differential equations (ODEs) to establish a smooth transformation between complex high-dimensional data distributions and tractable prior distributions. In this paper, we reveal a striking geometric regularity in the deterministic sampling dynamics: each simulated sampling trajectory lies within an extremely low-dimensional subspace, and all trajectories exhibit an almost identical "boomerang" shape, regardless of the model architecture, applied conditions, or generated content. We characterize several intriguing properties of these trajectories, particularly under closed-form solutions based on kernel-estimated data modeling. We also demonstrate a practical application of the discovered trajectory regularity by proposing a dynamic programming-based scheme to better align the sampling time schedule with the underlying trajectory structure. This simple strategy requires minimal modification to existing ODE-based numerical solvers, incurs negligible computational overhead, and achieves superior image generation performance, especially in the low-budget regime of only $5 \sim 10$ function evaluations.
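The abstract does not give the authors' algorithm, but the idea of aligning a small time schedule with trajectory structure via dynamic programming can be sketched. Assuming a dense reference trajectory `traj` (an array of `N` latent states recorded at fine timesteps, a hypothetical input here) and a piecewise-linear deviation cost, a DP over segment endpoints picks the `K` jumps whose straight-line segments deviate least from the dense trajectory:

```python
import numpy as np

def segment_cost(traj, i, j):
    # Sum of squared distances of the skipped fine-grained points
    # from the straight segment traj[i] -> traj[j].
    a, b = traj[i], traj[j]
    d = b - a
    denom = np.dot(d, d) + 1e-12
    cost = 0.0
    for k in range(i + 1, j):
        t = np.dot(traj[k] - a, d) / denom
        proj = a + t * d          # orthogonal projection onto the segment
        cost += np.linalg.norm(traj[k] - proj) ** 2
    return cost

def dp_schedule(traj, K):
    """Choose K+1 knot indices (including both endpoints) along a dense
    trajectory of N points, minimizing total piecewise-linear deviation."""
    N = len(traj)
    # Pairwise segment costs for every candidate jump i -> j (i < j).
    C = np.full((N, N), np.inf)
    for i in range(N):
        for j in range(i + 1, N):
            C[i, j] = segment_cost(traj, i, j)
    # dp[k, j]: best cost reaching fine step j using exactly k segments.
    dp = np.full((K + 1, N), np.inf)
    parent = np.full((K + 1, N), -1, dtype=int)
    dp[0, 0] = 0.0
    for k in range(1, K + 1):
        for j in range(1, N):
            for i in range(j):
                c = dp[k - 1, i] + C[i, j]
                if c < dp[k, j]:
                    dp[k, j] = c
                    parent[k, j] = i
    # Backtrack from the final fine step to recover the knot indices.
    idx, k, j = [N - 1], K, N - 1
    while k > 0:
        j = parent[k, j]
        idx.append(j)
        k -= 1
    return idx[::-1]
```

The returned indices can then be mapped back to sampling times and handed to any existing ODE solver unchanged, which matches the abstract's claim of minimal modification; the quadratic cost table is computed once per model, so the overhead is negligible relative to network evaluations.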
Similar Papers
Minimax Optimality of the Probability Flow ODE for Diffusion Models
Machine Learning (CS)
Makes AI create realistic pictures more accurately.
Fast Convergence for High-Order ODE Solvers in Diffusion Probabilistic Models
Machine Learning (CS)
Makes AI create realistic pictures faster and better.
Stochastic Transport Maps in Diffusion Models and Sampling
Probability
Creates new data that looks real.