Be Tangential to Manifold: Discovering Riemannian Metric for Diffusion Models
By: Shinnosuke Saito, Takashi Matsubara
Potential Business Impact:
Makes AI-generated pictures change smoothly.
Diffusion models are powerful deep generative models (DGMs) that generate high-fidelity, diverse content. However, unlike classical DGMs, they lack an explicit, tractable low-dimensional latent space that parameterizes the data manifold. This absence limits manifold-aware analysis and operations, such as interpolation and editing. Existing interpolation methods for diffusion models typically follow paths through high-density regions, which are not necessarily aligned with the data manifold and can yield perceptually unnatural transitions. To exploit the data manifold learned by diffusion models, we propose a novel Riemannian metric on the noise space, inspired by recent findings that the Jacobian of the score function captures the tangent spaces to the local data manifold. This metric encourages geodesics in the noise space to stay within or run parallel to the learned data manifold. Experiments on image interpolation show that our metric produces perceptually more natural and faithful transitions than existing density-based and naive baselines.
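The abstract states the core idea (a metric on the noise space built from the score function's Jacobian, so that low-energy paths stay tangent to the data manifold) but not the exact formula. A minimal NumPy sketch of that general principle might look like the following. Everything here is an illustrative assumption, not the paper's actual construction: the data manifold is a toy line in 2-D with an analytic Gaussian score, the metric is taken as G(x) = I + λ·P_normal(x) with P_normal built from the dominant eigenvector of the negated score Jacobian, and all names are hypothetical.

```python
import numpy as np

# Toy "data manifold": the x-axis embedded in 2-D, smoothed by Gaussian
# noise. For p(x) = N(0, diag(s1^2, s2^2)), the score is -x / s^2
# elementwise, and its Jacobian is the constant matrix diag(-1/s^2).
s = np.array([10.0, 0.1])  # large variance along the manifold, tiny across it

def score_jacobian(x):
    # Analytic Jacobian of the score for the toy Gaussian above.
    return np.diag(-1.0 / s**2)

def metric(x, lam=100.0):
    # Hypothetical Riemannian metric G(x) = I + lam * P_normal(x).
    # Directions where -J has large eigenvalues are (approximately) normal
    # to the manifold; moving along them is penalized by the factor lam.
    J = score_jacobian(x)
    eigvals, eigvecs = np.linalg.eigh(-J)       # ascending eigenvalues
    n = eigvecs[:, np.argmax(eigvals)]          # top normal direction
    return np.eye(2) + lam * np.outer(n, n)

def path_energy(path, lam=100.0):
    # Discrete Riemannian energy  sum_i dx_i^T G(x_i) dx_i  of a polyline.
    E = 0.0
    for a, b in zip(path[:-1], path[1:]):
        dx = b - a
        E += dx @ metric((a + b) / 2, lam) @ dx
    return E

# Compare a straight path along the manifold with one that leaves it.
t = np.linspace(0.0, 1.0, 50)[:, None]
on_manifold = (1 - t) * np.array([-1.0, 0.0]) + t * np.array([1.0, 0.0])
off_manifold = on_manifold + np.c_[np.zeros(50), np.sin(np.pi * t[:, 0])]

E_on = path_energy(on_manifold)
E_off = path_energy(off_manifold)
```

Under this metric, the on-manifold path has much lower energy, so geodesics (energy minimizers) are pulled toward paths that run along, rather than across, the learned manifold, which is the behavior the paper's interpolation experiments target.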
Similar Papers
What's Inside Your Diffusion Model? A Score-Based Riemannian Metric to Explore the Data Manifold
Machine Learning (CS)
Makes AI create smoother, more realistic image changes.
Image Interpolation with Score-based Riemannian Metrics of Diffusion Models
CV and Pattern Recognition
Makes AI art smoother and more realistic.
Generative Learning of Densities on Manifolds
Machine Learning (CS)
Creates new realistic pictures from simple ideas.