
Analysis of Semi-Supervised Learning on Hypergraphs

Published: October 29, 2025 | arXiv ID: 2510.25354v1

By: Adrien Weihs, Andrea Bertozzi, Matthew Thorpe

Potential Business Impact:

Helps computers learn from data where connections involve whole groups of items, not just pairs, even when only a few examples are labeled.

Business Areas:
A/B Testing, Data and Analytics

Hypergraphs provide a natural framework for modeling higher-order interactions, yet their theoretical underpinnings in semi-supervised learning remain limited. We provide an asymptotic consistency analysis of variational learning on random geometric hypergraphs, precisely characterizing the conditions that ensure the well-posedness of hypergraph learning and showing convergence to a weighted $p$-Laplacian equation. Motivated by this, we propose Higher-Order Hypergraph Learning (HOHL), which regularizes via powers of Laplacians from skeleton graphs to enforce multiscale smoothness. The HOHL energy converges to a higher-order Sobolev seminorm. Empirically, HOHL performs strongly against standard baselines.
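The HOHL idea of regularizing with powers of skeleton-graph Laplacians can be sketched concretely. Below is a minimal, hypothetical Python illustration of that style of penalty for semi-supervised regression on a point cloud; the k-NN graph construction, the Laplacian-power orders and weights, and the fidelity parameter are illustrative assumptions, not the paper's implementation.

    # Hypothetical sketch: semi-supervised regression using a weighted sum of
    # powers of a skeleton-graph Laplacian as a multiscale smoothness penalty,
    # in the spirit of HOHL. All constants and the graph construction are
    # illustrative assumptions.
    import numpy as np

    def knn_laplacian(X, k=10):
        """Unnormalized Laplacian L = D - W of a symmetrized k-NN graph."""
        n = X.shape[0]
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # squared distances
        W = np.zeros((n, n))
        for i in range(n):
            nbrs = np.argsort(d2[i])[1:k + 1]        # skip the point itself
            W[i, nbrs] = np.exp(-d2[i, nbrs])
        W = np.maximum(W, W.T)                        # symmetrize the graph
        return np.diag(W.sum(1)) - W

    def hohl_fit(X, labeled_idx, y_labeled, orders=(1, 2), weights=(1.0, 0.1),
                 fidelity=1e3):
        """Minimize fidelity * ||u_labeled - y||^2 + sum_s w_s * u^T L^s u."""
        n = X.shape[0]
        L = knn_laplacian(X)
        # Multiscale regularizer: weighted sum of Laplacian powers.
        R = sum(w * np.linalg.matrix_power(L, s) for s, w in zip(orders, weights))
        P = np.zeros((n, n))
        b = np.zeros(n)
        P[labeled_idx, labeled_idx] = fidelity        # data-fidelity on labels
        b[labeled_idx] = fidelity * y_labeled
        return np.linalg.solve(R + P, b)              # quadratic => linear solve

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.uniform(size=(200, 2))
        y_true = np.sin(2 * np.pi * X[:, 0])
        labeled = rng.choice(200, size=10, replace=False)
        u = hohl_fit(X, labeled, y_true[labeled])
        unlabeled_err = np.delete(u, labeled) - np.delete(y_true, labeled)
        print("RMSE on unlabeled points:", np.sqrt(np.mean(unlabeled_err ** 2)))

Using higher orders (larger powers of L) penalizes rougher variation and corresponds, in the continuum, to the higher-order Sobolev seminorm limit described in the abstract; a single first-order term recovers standard graph Laplacian regularization.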

Page Count
52 pages

Category
Computer Science: Machine Learning (CS)