Probabilistic Graph Cuts
By: Ayoub Ghriss
Potential Business Impact:
Helps computers group similar things together better.
Probabilistic relaxations of graph cuts offer a differentiable alternative to spectral clustering, enabling end-to-end and online learning without eigendecompositions; yet prior work centered on RatioCut and lacked general guarantees and principled gradients. We present a unified probabilistic framework that covers a wide class of cuts, including Normalized Cut. Our framework provides tight analytic upper bounds on expected discrete cuts via integral representations and Gauss hypergeometric functions, with closed-form forward and backward passes. Together, these results deliver a rigorous, numerically stable foundation for scalable, differentiable graph partitioning covering a wide range of clustering and contrastive learning objectives.
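To make the idea concrete, the sketch below shows a naive probabilistic relaxation of Normalized Cut: given soft cluster assignments, it computes an expected cut by treating node assignments as independent. This is only an illustration of the general relaxation idea; the paper's contribution is tight analytic upper bounds via integral representations and Gauss hypergeometric functions, which this simple estimator does not implement. All function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def expected_normalized_cut(W, P):
    """Naive expected Normalized Cut under independent soft assignments.

    W : (n, n) symmetric nonnegative affinity matrix.
    P : (n, k) soft assignment matrix; each row sums to 1.
    Returns a scalar that is differentiable in P (here via NumPy ops;
    the same formula works in an autodiff framework).
    """
    d = W.sum(axis=1)                 # node degrees
    # Expected within-cluster association: sum_{i,l,j} W[i,l] P[i,j] P[l,j]
    assoc = (P * (W @ P)).sum(axis=0)
    vol = P.T @ d                     # expected cluster volumes
    cut = vol - assoc                 # expected edge mass leaving each cluster
    return float((cut / vol).sum())
```

With hard (one-hot) rows of `P`, this recovers the discrete Normalized Cut, so the relaxation can be checked against known partitions before training soft assignments with gradient descent.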
Similar Papers
Graph-Regularized Learning of Gaussian Mixture Models
Machine Learning (CS)
Shares computer learning without sharing private data.
Neural Normalized Cut: A Differential and Generalizable Approach for Spectral Clustering
Machine Learning (CS)
Finds groups in huge amounts of data faster.