Score: 2

Stabilizing Self-Consuming Diffusion Models with Latent Space Filtering

Published: November 16, 2025 | arXiv ID: 2511.12742v1

By: Zhongteng Cai, Yaxuan Wang, Yang Liu, and more

Potential Business Impact:

Filters out low-quality AI-generated images from training data so generative models stay stable when retrained on their own outputs.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

As synthetic data proliferates across the Internet, it is often reused to train successive generations of generative models. This creates a "self-consuming loop" that can lead to training instability or model collapse. Common strategies to address the issue, such as accumulating historical training data or injecting fresh real data, either increase computational cost or require expensive human annotation. In this paper, we empirically analyze the latent space dynamics of self-consuming diffusion models and observe that the low-dimensional structure of latent representations extracted from synthetic data degrades over generations. Based on this insight, we propose Latent Space Filtering (LSF), a novel approach that mitigates model collapse by filtering out less realistic synthetic data from mixed datasets. Theoretically, we present a framework that connects latent space degradation to empirical observations. Experimentally, we show that LSF consistently outperforms existing baselines across multiple real-world datasets, effectively mitigating model collapse without increasing training cost or relying on human annotation.
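
The abstract does not spell out the exact filtering rule, but the core idea of scoring synthetic samples by how well their latent representations fit the low-dimensional structure of real data can be sketched roughly as below. This is a minimal illustration, not the authors' algorithm: it assumes latents are already extracted, uses a PCA subspace of real-data latents as a stand-in for the paper's latent-space criterion, and all function names and the reconstruction-error score are illustrative assumptions.

```python
# Hypothetical sketch of latent-space filtering; NOT the paper's exact method.
import numpy as np

def fit_latent_subspace(real_latents: np.ndarray, n_components: int = 16):
    """Fit a low-dimensional subspace to latents of real images via PCA (SVD)."""
    mean = real_latents.mean(axis=0)
    centered = real_latents - mean
    # Right singular vectors span the principal subspace of the real data.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]           # shapes: (d,), (k, d)

def reconstruction_error(latents: np.ndarray, mean: np.ndarray, basis: np.ndarray):
    """Distance of each latent from the real-data subspace (smaller = more realistic)."""
    centered = latents - mean
    projected = centered @ basis.T @ basis    # project onto the k-dim subspace
    return np.linalg.norm(centered - projected, axis=1)

def filter_synthetic(synth_latents: np.ndarray, mean, basis, keep_fraction: float = 0.7):
    """Keep the synthetic samples whose latents lie closest to the real-data subspace."""
    errors = reconstruction_error(synth_latents, mean, basis)
    threshold = np.quantile(errors, keep_fraction)
    return np.where(errors <= threshold)[0]   # indices of retained synthetic samples

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    real = rng.normal(size=(1000, 64))        # stand-in latents of real images
    synth = rng.normal(size=(500, 64)) * 1.5  # stand-in latents of synthetic images
    mean, basis = fit_latent_subspace(real, n_components=16)
    kept = filter_synthetic(synth, mean, basis, keep_fraction=0.7)
    print(f"retained {kept.size} of {synth.shape[0]} synthetic samples")
```

The retained synthetic samples would then be mixed with real data for the next training generation, the point being that filtering happens in latent space, so no human annotation or extra training passes are needed.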

Country of Origin
🇺🇸 United States

Repos / Data Links

Page Count
13 pages

Category
Computer Science:
Machine Learning (CS)