Flow Matching in the Low-Noise Regime: Pathologies and a Contrastive Remedy
By: Weili Zeng, Yichao Yan
Potential Business Impact:
Makes AI image generation more stable and its internal representations more reliable.
Flow matching has recently emerged as a powerful alternative to diffusion models, providing a continuous-time formulation for generative modeling and representation learning. Yet, we show that this framework suffers from a fundamental instability in the low-noise regime. As noise levels approach zero, arbitrarily small perturbations in the input can induce large variations in the velocity target, causing the condition number of the learning problem to diverge. This ill-conditioning not only slows optimization but also forces the encoder to reallocate its limited Jacobian capacity toward noise directions, thereby degrading semantic representations. We provide the first theoretical analysis of this phenomenon, which we term the low-noise pathology, establishing its intrinsic link to the structure of the flow matching objective. Building on these insights, we propose Local Contrastive Flow (LCF), a hybrid training protocol that replaces direct velocity regression with contrastive feature alignment at small noise levels, while retaining standard flow matching at moderate and high noise. Empirically, LCF not only improves convergence speed but also stabilizes representation quality. Our findings highlight the critical importance of addressing low-noise pathologies to unlock the full potential of flow matching for both generation and representation learning.
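The hybrid protocol described above can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: the interpolation convention, the noise threshold `sigma_min`, the InfoNCE-style alignment term, and all function names are assumptions made for the sake of a concrete example. It shows the core idea of switching the objective by noise level: standard velocity regression at moderate and high noise, contrastive feature alignment when the noise level falls below a small threshold (where, as the abstract notes, the regression target becomes ill-conditioned).

```python
import numpy as np

def flow_matching_loss(v_pred, x1, x0):
    # Standard flow matching regression. For a linear interpolation
    # x_t = (1 - t) * x0 + t * x1 the velocity target is u = x1 - x0.
    # Recovering endpoints from x_t near the data end scales like
    # 1 / (1 - t), which is the source of the low-noise ill-conditioning.
    return np.mean((v_pred - (x1 - x0)) ** 2)

def contrastive_alignment_loss(z_noisy, z_clean, temperature=0.1):
    # InfoNCE-style alignment (an assumed instantiation of the paper's
    # "contrastive feature alignment"): each noisy feature should match
    # its own clean counterpart against the rest of the batch.
    z_noisy = z_noisy / np.linalg.norm(z_noisy, axis=1, keepdims=True)
    z_clean = z_clean / np.linalg.norm(z_clean, axis=1, keepdims=True)
    logits = z_noisy @ z_clean.T / temperature
    # Row-wise log-softmax; the positive pair sits on the diagonal.
    logp = logits - np.log(np.sum(np.exp(logits), axis=1, keepdims=True))
    idx = np.arange(len(logits))
    return -np.mean(logp[idx, idx])

def lcf_loss(sigma, v_pred, x0, x1, z_noisy, z_clean, sigma_min=0.05):
    # Hypothetical LCF-style switch: below the (assumed) noise threshold,
    # replace velocity regression with contrastive alignment; otherwise
    # keep standard flow matching.
    if sigma < sigma_min:
        return contrastive_alignment_loss(z_noisy, z_clean)
    return flow_matching_loss(v_pred, x1, x0)
```

In a training loop one would sample a noise level per example and route the batch through the corresponding branch; the threshold `sigma_min` is a tunable hyperparameter here, not a value taken from the paper.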
Similar Papers
On the Closed-Form of Flow Matching: Generalization Does Not Arise from Target Stochasticity
Machine Learning (CS)
Makes AI create better pictures by simplifying its learning.
Low-Field Magnetic Resonance Image Quality Enhancement using a Conditional Flow Matching Model
CV and Pattern Recognition
Makes blurry MRI scans clear like expensive ones.
Contrastive Flow Matching
CV and Pattern Recognition
Makes AI create better pictures faster.