Multitask Learning with Stochastic Interpolants
By: Hugo Negrel, Florentin Coeurdoux, Michael S. Albergo, and more
Potential Business Impact:
Creates one AI model that handles many different jobs without task-specific retraining.
We propose a framework for learning maps between probability distributions that broadly generalizes the time dynamics of flow and diffusion models. To enable this, we generalize stochastic interpolants by replacing the scalar time variable with vectors, matrices, or linear operators, allowing us to bridge probability distributions across multiple dimensional spaces. This approach enables the construction of versatile generative models capable of fulfilling multiple tasks without task-specific training. Our operator-based interpolants not only provide a unifying theoretical perspective for existing generative models but also extend their capabilities. Through numerical experiments, we demonstrate the zero-shot efficacy of our method on conditional generation and inpainting, fine-tuning and posterior sampling, and multiscale modeling, suggesting its potential as a generic task-agnostic alternative to specialized models.
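To make the operator-valued time idea concrete, here is a minimal sketch (not the authors' exact parametrization) of a linear stochastic interpolant where the scalar time t in x_t = (1 - t) x0 + t x1 is replaced by a diagonal matrix T, giving each coordinate its own time. The function name `operator_interpolant` and the specific masked-time example are illustrative assumptions; setting some diagonal entries to 1 pins those coordinates to the target sample, which is how a single model can support tasks such as inpainting without task-specific training.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4

def operator_interpolant(x0, x1, T):
    """X_T = (I - T) x0 + T x1 for a linear operator T.

    With T = t * I (scalar time) this is the usual flow-matching
    bridge; a general diagonal T assigns each coordinate its own time.
    """
    I = np.eye(len(x0))
    return (I - T) @ x0 + T @ x1

x0 = rng.standard_normal(d)   # sample from the base distribution
x1 = rng.standard_normal(d)   # sample from the target distribution

# Scalar time t = 0.5 recovers the midpoint of the ordinary bridge.
mid = operator_interpolant(x0, x1, 0.5 * np.eye(d))

# Per-coordinate times: entries with T_ii = 1 are already "done"
# (pinned to the target, e.g. observed pixels in inpainting), while
# entries with T_ii = 0 have not started moving from the base sample.
T_masked = np.diag([1.0, 1.0, 0.3, 0.0])
partial = operator_interpolant(x0, x1, T_masked)

print(np.allclose(mid, 0.5 * (x0 + x1)))  # scalar case matches
print(np.isclose(partial[0], x1[0]))      # pinned coordinate hit target
print(np.isclose(partial[3], x0[3]))      # frozen coordinate unchanged
```

Choosing different operators T for different tasks at sampling time, rather than retraining, is what makes the framework task-agnostic.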
Similar Papers
Likely Interpolants of Generative Models
Machine Learning (CS)
Makes AI create smoother, more realistic images.
Physics-aware generative models for turbulent fluid flows through energy-consistent stochastic interpolants
Computational Engineering, Finance, and Science
Makes simulations of turbulent fluid flows faster and more physically consistent.