Multitask Learning with Stochastic Interpolants

Published: August 6, 2025 | arXiv ID: 2508.04605v1

By: Hugo Negrel, Florentin Coeurdoux, Michael S. Albergo, and others

Potential Business Impact:

A single generative model that can handle many tasks, such as conditional generation, inpainting, and posterior sampling, without task-specific retraining.

We propose a framework for learning maps between probability distributions that broadly generalizes the time dynamics of flow and diffusion models. To enable this, we generalize stochastic interpolants by replacing the scalar time variable with vectors, matrices, or linear operators, allowing us to bridge probability distributions across multiple dimensional spaces. This approach enables the construction of versatile generative models capable of fulfilling multiple tasks without task-specific training. Our operator-based interpolants not only provide a unifying theoretical perspective for existing generative models but also extend their capabilities. Through numerical experiments, we demonstrate the zero-shot efficacy of our method on conditional generation and inpainting, fine-tuning and posterior sampling, and multiscale modeling, suggesting its potential as a generic task-agnostic alternative to specialized models.
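The abstract's central idea, replacing the scalar time variable of a stochastic interpolant with a matrix or linear operator, can be illustrated with a toy sketch. The snippet below is not the paper's implementation; it only shows, under the simplest linear-bridge assumption, how an operator-valued "time" recovers the usual scalar path as a special case and how a diagonal mask yields an inpainting-style partial bridge:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy dimension

x0 = rng.normal(size=d)   # sample from the base distribution
x1 = rng.normal(size=d)   # sample from the target distribution

def interpolant(x0, x1, T):
    """Linear interpolant path x_T = (I - T) x0 + T x1, where T is a
    d x d linear operator generalizing the scalar time variable."""
    I = np.eye(len(x0))
    return (I - T) @ x0 + T @ x1

# Scalar time t is recovered as T = t * I (the familiar flow/diffusion bridge).
t = 0.5
assert np.allclose(interpolant(x0, x1, t * np.eye(d)),
                   (1 - t) * x0 + t * x1)

# Operator-valued "time": a diagonal 0/1 mask advances only some coordinates,
# loosely mimicking how observed pixels could be clamped for zero-shot
# inpainting-style conditioning.
mask = np.diag((np.arange(d) < d // 2).astype(float))
x_T = interpolant(x0, x1, mask)
assert np.allclose(x_T[: d // 2], x1[: d // 2])  # masked-in half follows x1
assert np.allclose(x_T[d // 2:], x0[d // 2:])    # remaining half stays at x0
```

Here `interpolant`, `mask`, and the diagonal-operator choice are hypothetical names for illustration only; the paper develops the general operator-based framework and its learning objectives.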

Country of Origin
🇺🇸 United States

Page Count
22 pages

Category
Computer Science:
Machine Learning (CS)