Score: 1

Modular MeanFlow: Towards Stable and Scalable One-Step Generative Modeling

Published: August 24, 2025 | arXiv ID: 2508.17426v1

By: Haochen You, Baojing Liu, Hongyang He

Potential Business Impact:

Lets generative models produce realistic images in a single step, cutting inference time and compute cost.

Business Areas:
Simulation Software

One-step generative modeling seeks to generate high-quality data samples in a single function evaluation, significantly improving efficiency over traditional diffusion or flow-based models. In this work, we introduce Modular MeanFlow (MMF), a flexible and theoretically grounded approach for learning time-averaged velocity fields. Our method derives a family of loss functions based on a differential identity linking instantaneous and average velocities, and incorporates a gradient modulation mechanism that enables stable training without sacrificing expressiveness. We further propose a curriculum-style warmup schedule to smoothly transition from coarse supervision to fully differentiable training. The MMF formulation unifies and generalizes existing consistency-based and flow-matching methods, while avoiding expensive higher-order derivatives. Empirical results across image synthesis and trajectory modeling tasks demonstrate that MMF achieves competitive sample quality, robust convergence, and strong generalization, particularly under low-data or out-of-distribution settings.
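To make the abstract's mechanics concrete, below is a minimal sketch (not the authors' released code) of a MeanFlow-style objective in PyTorch. It assumes a network u_theta(x, r, t) that predicts the time-averaged velocity between times r and t; the linear interpolation path, the conditional velocity x1 - x0, and the blending coefficient lam standing in for the paper's gradient modulation are illustrative assumptions.

```python
# Minimal sketch of a MeanFlow-style loss (illustrative, not the paper's implementation).
import torch
from torch.func import jvp

def mmf_loss(u_theta, x1, lam=0.5):
    """MeanFlow-style loss on a data batch x1; u_theta(x, r, t) predicts average velocity."""
    b = x1.shape[0]
    x0 = torch.randn_like(x1)                      # noise endpoint of the flow
    t = torch.rand(b, device=x1.device)            # current time in (0, 1)
    r = torch.rand(b, device=x1.device) * t        # earlier time, so r <= t
    tb = t.view(-1, *([1] * (x1.dim() - 1)))       # time reshaped for broadcasting
    xt = (1 - tb) * x0 + tb * x1                   # linear interpolation path (assumed)
    v = x1 - x0                                    # conditional instantaneous velocity

    # Total derivative du/dt along the path via a forward-mode JVP;
    # the tangent is (dx/dt, dr/dt, dt/dt) = (v, 0, 1).
    u, dudt = jvp(u_theta, (xt, r, t),
                  (v, torch.zeros_like(r), torch.ones_like(t)))

    # Differential identity linking average and instantaneous velocity:
    # u(x_t, r, t) = v(x_t, t) - (t - r) * d/dt u(x_t, r, t).
    target = v - (t - r).view_as(tb) * dudt

    # Gradient modulation (assumed form): lam interpolates between a fully
    # differentiable target (lam = 1) and a detached, stable one (lam = 0).
    target = lam * target + (1.0 - lam) * target.detach()
    return ((u - target) ** 2).mean()
```

Under this reading, the curriculum-style warmup described in the abstract would correspond to ramping lam from 0 (coarse, detached supervision) toward a fully differentiable target over the course of training.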

Country of Origin
πŸ‡¬πŸ‡§ πŸ‡ΊπŸ‡Έ United Kingdom, United States

Page Count
15 pages

Category
Computer Science:
Machine Learning (CS)