Generative Modeling with Manifold Percolation
By: Rui Tong
Potential Business Impact:
Teaches computers to draw new, diverse pictures.
Generative modeling is typically framed as learning mapping rules, but from the perspective of an observer without access to those rules, the task manifests as disentangling the geometric support from the probability distribution defined on it. We propose that Continuum Percolation is uniquely suited to this support analysis, since the sampling process effectively projects high-dimensional density estimation onto a geometric counting problem on the support. In this work, we establish a rigorous correspondence between the topological phase transitions of Random Geometric Graphs built on samples and the connectivity of the underlying data manifold in high-dimensional space. By analyzing the relationship between our proposed Percolation Shift metric and FID, we demonstrate that our metric captures structural pathologies (such as implicit mode collapse) that statistical metrics miss. Finally, we translate this topological phenomenon into a differentiable loss function to guide training. Experimental results confirm that this approach not only prevents manifold shrinkage but also drives the model toward a state of "Hyper-Generalization," achieving good fidelity together with verified topological expansion.
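To make the abstract's central device concrete, below is a minimal sketch of estimating an empirical percolation threshold on a Random Geometric Graph built over samples, assuming standard NumPy/SciPy/NetworkX tooling. The helper names (giant_component_fraction, percolation_curve), the toy data, and the half-coverage threshold rule are illustrative assumptions; the paper's actual Percolation Shift metric and its differentiable loss are not reproduced here.

```python
# Toy sketch: build a Random Geometric Graph over samples, sweep the
# connection radius, and locate the radius at which a giant connected
# component emerges. Names and the threshold rule are illustrative only.
import numpy as np
import networkx as nx
from scipy.spatial import cKDTree


def giant_component_fraction(points: np.ndarray, radius: float) -> float:
    """Fraction of points in the largest connected component of the
    Random Geometric Graph G(points, radius)."""
    tree = cKDTree(points)
    edges = tree.query_pairs(r=radius)          # all pairs within `radius`
    graph = nx.Graph()
    graph.add_nodes_from(range(len(points)))
    graph.add_edges_from(edges)
    largest = max(nx.connected_components(graph), key=len)
    return len(largest) / len(points)


def percolation_curve(points: np.ndarray, radii: np.ndarray) -> np.ndarray:
    """Giant-component fraction swept over a grid of connection radii."""
    return np.array([giant_component_fraction(points, r) for r in radii])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    real = rng.normal(size=(500, 2))            # reference samples (toy 2-D data)
    fake = 0.5 * rng.normal(size=(500, 2))      # toy over-concentrated generator
    radii = np.linspace(0.02, 0.6, 30)

    # Crude empirical threshold: first radius where the giant component
    # covers half the points.
    r_real = radii[np.argmax(percolation_curve(real, radii) >= 0.5)]
    r_fake = radii[np.argmax(percolation_curve(fake, radii) >= 0.5)]
    print(f"empirical percolation shift: {r_fake - r_real:+.3f}")
```

Comparing the two thresholds gives a rough structural signal: an over-concentrated (collapsed) sample set percolates at a markedly smaller radius than the real data, which is the kind of support-level pathology the abstract argues purely statistical metrics such as FID can miss.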
Similar Papers
Manifold Percolation: from generative model to Reinforcement Learning
Machine Learning (Stat)
Makes AI create more realistic and varied images.
Learning Geometry: A Framework for Building Adaptive Manifold Models through Metric Optimization
Machine Learning (CS)
Teaches computers to learn by changing their shape.
Generative Learning of Densities on Manifolds
Machine Learning (CS)
Creates new realistic pictures from simple ideas.