Flows and Diffusions on the Neural Manifold

Published: July 14, 2025 | arXiv ID: 2507.10623v1

By: Daniel Saragih, Deyu Cao, Tejas Balaji

Potential Business Impact:

Generates neural network weights that improve model initialization and training, and helps detect harmful data shifts in deployed AI systems.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Diffusion and flow-based generative models have achieved remarkable success in domains such as image synthesis, video generation, and natural language modeling. In this work, we extend these advances to weight space learning by leveraging recent techniques to incorporate structural priors derived from optimization dynamics. Central to our approach is modeling the trajectory induced by gradient descent as a trajectory inference problem. We unify several trajectory inference techniques under the framework of gradient flow matching, providing a theoretical framework for treating optimization paths as inductive bias. We further explore architectural and algorithmic choices, including reward fine-tuning by adjoint matching, the use of autoencoders for latent weight representation, conditioning on task-specific context data, and adopting informative source distributions such as Kaiming uniform. Experiments demonstrate that our method matches or surpasses baselines in generating in-distribution weights, improves initialization for downstream training, and supports fine-tuning to enhance performance. Finally, we illustrate a practical application in safety-critical systems: detecting harmful covariate shifts, where our method outperforms the closest comparable baseline.
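The core idea, gradient flow matching, treats checkpoints saved along a gradient-descent run as a trajectory and trains a velocity field to transport a source distribution (e.g., Kaiming uniform) toward trained weights. Below is a minimal sketch of one conditional-flow-matching training step over flattened weight vectors; the `VectorField` architecture, linear interpolation path, and hyperparameters are illustrative assumptions, not the paper's exact setup.

```python
import torch
import torch.nn as nn

class VectorField(nn.Module):
    """Illustrative velocity network v_theta(w_t, t) over flattened weights."""
    def __init__(self, dim, hidden=512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, w, t):
        # Condition the velocity prediction on the interpolation time t.
        return self.net(torch.cat([w, t[:, None]], dim=-1))

def cfm_step(v_theta, w1, optimizer):
    """One conditional flow matching step toward trained weights w1.

    Source samples w0 are drawn from a Kaiming-uniform-style distribution,
    one of the informative source distributions the abstract mentions.
    Assumes w1 has shape (batch, dim).
    """
    dim = w1.shape[1]
    bound = (6.0 / dim) ** 0.5                      # Kaiming bound, assuming fan-in = dim
    w0 = (torch.rand_like(w1) * 2 - 1) * bound      # source sample
    t = torch.rand(w1.shape[0])                     # random time in [0, 1]
    wt = (1 - t[:, None]) * w0 + t[:, None] * w1    # point on the linear path
    target = w1 - w0                                # the path's constant velocity
    loss = ((v_theta(wt, t) - target) ** 2).mean()  # regress velocity onto target
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Sampling new weights would then integrate the learned field from a fresh source draw, e.g. with a simple Euler loop over `t` from 0 to 1; the paper's additions (adjoint-matching fine-tuning, latent autoencoders, task conditioning) build on top of this basic objective.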

Country of Origin
🇨🇦 Canada

Page Count
40 pages

Category
Computer Science:
Machine Learning (CS)