Analysis of Semi-Supervised Learning on Hypergraphs
By: Adrien Weihs, Andrea Bertozzi, Matthew Thorpe
Potential Business Impact:
Helps computers learn from complex group connections.
Hypergraphs provide a natural framework for modeling higher-order interactions, yet their theoretical underpinnings in semi-supervised learning remain limited. We provide an asymptotic consistency analysis of variational learning on random geometric hypergraphs, precisely characterizing the conditions ensuring the well-posedness of hypergraph learning as well as showing convergence to a weighted $p$-Laplacian equation. Motivated by this, we propose Higher-Order Hypergraph Learning (HOHL), which regularizes via powers of Laplacians from skeleton graphs for multiscale smoothness. The HOHL functional converges to a higher-order Sobolev seminorm. Empirically, HOHL performs strongly against standard baselines.
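To make the HOHL idea concrete, here is a minimal, hedged sketch of regularizing with powers of a skeleton-graph Laplacian. All function names, the clique-expansion choice of skeleton graph, and the specific powers and weights are illustrative assumptions for exposition, not the authors' implementation.

```python
import numpy as np

def skeleton_laplacian(hyperedges, n):
    """Unnormalized Laplacian of a clique-expansion (skeleton) graph:
    each hyperedge links every pair of its nodes. Illustrative choice."""
    W = np.zeros((n, n))
    for e in hyperedges:
        for i in e:
            for j in e:
                if i != j:
                    W[i, j] = 1.0
    return np.diag(W.sum(axis=1)) - W

def hohl_energy(u, L, powers=(1, 2), weights=(1.0, 0.1)):
    """Multiscale smoothness penalty: sum_s w_s * u^T L^s u.
    Higher powers of L penalize higher-order roughness."""
    return sum(w * u @ np.linalg.matrix_power(L, s) @ u
               for s, w in zip(powers, weights))

def fit(hyperedges, n, labeled, y, lam=10.0, powers=(1, 2), weights=(1.0, 0.1)):
    """Semi-supervised labeling: minimize the quadratic objective
    sum_s w_s u^T L^s u + lam * sum_{i in labeled} (u_i - y_i)^2
    in closed form by solving the normal equations."""
    L = skeleton_laplacian(hyperedges, n)
    A = sum(w * np.linalg.matrix_power(L, s) for s, w in zip(powers, weights))
    M = np.zeros((n, n))
    b = np.zeros(n)
    for i, yi in zip(labeled, y):
        M[i, i] = lam          # data-fidelity term on labeled nodes
        b[i] = lam * yi
    return np.linalg.solve(A + M, b)
```

For example, on a toy hypergraph with two overlapping hyperedges and one label at each end, the solution interpolates smoothly between the labeled values. The closed-form solve is only viable for small graphs; at scale one would use sparse matrices and iterative solvers.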
Similar Papers
Higher-Order Regularization Learning on Hypergraphs
Machine Learning (CS)
Teaches computers to learn from complex connections.
Scalable Sample-to-Population Estimation of Hyperbolic Space Models for Hypergraphs
Methodology
Finds hidden groups in complex connections.
Scalable Hypergraph Structure Learning with Diverse Smoothness Priors
Machine Learning (CS)
Finds hidden connections in complex networks.