Manifold Percolation: from Generative Models to Reinforcement Learning

Published: November 25, 2025 | arXiv ID: 2511.20503v2

By: Rui Tong

Potential Business Impact:

Helps generative AI produce more realistic and varied images by detecting and preventing manifold shrinkage (e.g., mode collapse) during training.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Generative modeling is typically framed as learning mapping rules, but from an observer's perspective without access to these rules, the task becomes disentangling the geometric support from the probability distribution. We propose that continuum percolation is uniquely suited to this support analysis, as the sampling process effectively projects high-dimensional density estimation onto a geometric counting problem on the support. In this work, we establish a rigorous correspondence between the topological phase transitions of random geometric graphs and the underlying data manifold in high-dimensional space. By analyzing the relationship between our proposed Percolation Shift metric and FID, we show that this metric captures structural pathologies, such as implicit mode collapse, where standard statistical metrics fail. Finally, we translate this topological phenomenon into a differentiable loss function that guides training. Experimental results confirm that this approach not only prevents manifold shrinkage but also fosters a form of synergistic improvement, where topological stability becomes a prerequisite for sustained high fidelity in both static generation and sequential decision making.
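The core idea above, projecting density estimation onto a geometric counting problem, can be illustrated with a random geometric graph built over a sample set: as the connection radius grows, the fraction of points in the largest connected component jumps sharply near the percolation threshold, and a shift of that threshold between real and generated samples would indicate support shrinkage. The following is a minimal illustrative sketch of that counting, not the paper's actual Percolation Shift metric or loss; the function name and radius sweep are chosen for illustration only.

```python
import numpy as np

def giant_component_fraction(points, radius):
    """Fraction of points lying in the largest connected component of the
    random geometric graph with the given connection radius (illustrative)."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        # Union-find with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # Connect every pair of points within `radius` (O(n^2); fine for a sketch).
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    for i in range(n):
        for j in range(i + 1, n):
            if d2[i, j] <= radius ** 2:
                parent[find(i)] = find(j)

    roots = [find(i) for i in range(n)]
    _, counts = np.unique(roots, return_counts=True)
    return counts.max() / n

rng = np.random.default_rng(0)
samples = rng.uniform(size=(300, 2))  # stand-in for generated samples

# Sweeping the radius traces the percolation transition: the giant-component
# fraction stays small below the threshold and jumps toward 1 above it.
for r in [0.02, 0.05, 0.08, 0.12]:
    print(f"r={r}: giant fraction = {giant_component_fraction(samples, r):.2f}")
```

Comparing the radius at which this jump occurs for real versus generated samples is one simple way to probe whether the generator's support covers the data manifold, which is the structural pathology the abstract says FID can miss.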

Country of Origin
🇬🇧 United Kingdom

Page Count
26 pages

Category
Statistics: Machine Learning (stat.ML)