Likely Interpolants of Generative Models
By: Frederik Möbius Rygaard, Shen Zhu, Yinzhu Jin, and others
Potential Business Impact:
Makes AI create smoother, more realistic images.
Interpolation in generative models allows for controlled generation, model inspection, and more. Unfortunately, most generative models lack a principled notion of interpolants without restrictive assumptions on either the model or the data dimension. In this paper, we develop a general interpolation scheme that targets likely transition paths and is compatible with different metrics and probability distributions. We consider interpolants analogous to a geodesic constrained to a suitable data distribution and derive a novel algorithm for computing these curves, which requires no additional training. Theoretically, we show that our method can locally be regarded as a geodesic under a suitable Riemannian metric. We quantitatively show that our interpolation scheme traverses higher-density regions than baselines across a range of models and datasets.
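The abstract describes paths that behave like geodesics while staying in high-density regions, computed without extra training. The sketch below is not the paper's algorithm; it is a minimal illustration of the general idea under stated assumptions: a toy 2-D Gaussian-mixture stands in for the model's density, and a straight-line path is iteratively nudged by the score (gradient of log-density) while a discrete smoothness term keeps it geodesic-like. All function names and parameters here are hypothetical.

```python
import numpy as np

# Hypothetical stand-in density: an equal-weight 2-D Gaussian mixture.
# The paper's method would instead use the density implied by a trained
# generative model.
MEANS = np.array([[0.0, 0.0], [4.0, 0.0]])

def log_density(x):
    # Unnormalized log-density of the two-component mixture at point x.
    d2 = ((x[None, :] - MEANS) ** 2).sum(axis=1)
    return np.log(np.exp(-0.5 * d2).sum())

def grad_log_density(x, eps=1e-4):
    # Central finite-difference score; a real implementation would use
    # automatic differentiation through the model.
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (log_density(x + e) - log_density(x - e)) / (2 * eps)
    return g

def likely_interpolant(x0, x1, n=9, steps=200, lr=0.05, lam=1.0):
    """Sketch: start from the straight line between x0 and x1, then
    gradient-ascend each interior point on log-density while a discrete
    spring (curvature) term, weighted by lam, keeps the path smooth."""
    path = np.linspace(x0, x1, n + 2)  # includes the two fixed endpoints
    for _ in range(steps):
        for k in range(1, n + 1):
            # Discrete Laplacian: gradient of the path's smoothness energy.
            spring = path[k - 1] - 2 * path[k] + path[k + 1]
            path[k] += lr * (grad_log_density(path[k]) + lam * spring)
    return path

path = likely_interpolant(np.array([-1.0, 0.0]), np.array([5.0, 0.0]))
```

The trade-off between the score term and the spring term mirrors the abstract's framing: with a large `lam` the result approaches the straight (Euclidean geodesic) path, while a small `lam` lets the path bend toward high-density regions of the distribution.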
Similar Papers
Multitask Learning with Stochastic Interpolants
Machine Learning (CS)
Creates AI that learns many tasks without retraining.
Physics-aware generative models for turbulent fluid flows through energy-consistent stochastic interpolants
Computational Engineering, Finance, and Science
Makes computer weather forecasts faster and more accurate.