Higher-Order Regularization Learning on Hypergraphs
By: Adrien Weihs, Andrea Bertozzi, Matthew Thorpe
Potential Business Impact:
Teaches computers to learn from data with complex group connections.
Higher-Order Hypergraph Learning (HOHL) was recently introduced as a principled alternative to classical hypergraph regularization, enforcing higher-order smoothness via powers of multiscale Laplacians induced by the hypergraph structure. Prior work established the well- and ill-posedness of HOHL through an asymptotic consistency analysis in geometric settings. We extend this theoretical foundation by proving the consistency of a truncated version of HOHL and deriving explicit convergence rates when HOHL is used as a regularizer in fully supervised learning. We further demonstrate its strong empirical performance in active learning and in datasets lacking an underlying geometric structure, highlighting HOHL's versatility and robustness across diverse learning settings.
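To make the regularizer concrete, here is a minimal, hypothetical sketch in Python. It is not the paper's construction: a plain clique-expansion graph Laplacian stands in for the multiscale Laplacians described in the abstract, and higher-order smoothness is imposed through a penalty of the form u^T L^s u with an integer matrix power s. The helper names (clique_expansion_laplacian, hohl_fit), the weight tau, and the toy hypergraph are all illustrative assumptions, not artifacts of the paper.

```python
# Hypothetical sketch of higher-order regularization on a hypergraph.
# Assumptions (not from the paper): a clique-expansion Laplacian replaces
# the multiscale Laplacians, and the power s sets the smoothness order.
import numpy as np

def clique_expansion_laplacian(n, hyperedges):
    """Unnormalized graph Laplacian of the clique expansion of a hypergraph."""
    W = np.zeros((n, n))
    for e in hyperedges:
        for i in e:
            for j in e:
                if i != j:
                    W[i, j] += 1.0
    return np.diag(W.sum(axis=1)) - W

def hohl_fit(L, s, labeled_idx, labels, tau=1.0):
    """Fully supervised fit: minimize sum over labeled i of (u_i - y_i)^2
    plus tau * u^T L^s u.  The first-order condition is the linear system
    (M + tau * L^s) u = b, with M selecting labeled nodes and b their labels."""
    n = L.shape[0]
    Ls = np.linalg.matrix_power(L, s)  # higher-order smoothness via L^s
    M = np.zeros((n, n))
    b = np.zeros(n)
    for idx, y in zip(labeled_idx, labels):
        M[idx, idx] += 1.0
        b[idx] = y
    return np.linalg.solve(M + tau * Ls, b)

# Toy hypergraph: 6 nodes, two overlapping hyperedges.
hyperedges = [(0, 1, 2), (2, 3, 4, 5)]
L = clique_expansion_laplacian(6, hyperedges)
u = hohl_fit(L, s=2, labeled_idx=[0, 5], labels=[0.0, 1.0])
print(np.round(u, 3))
```

Raising the Laplacian to a power s > 1 penalizes roughness of higher derivatives of u rather than just first differences along hyperedges, which is the sense in which such a regularizer enforces higher-order smoothness; with s = 1 the sketch reduces to classical hypergraph (clique-expansion) regularization.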
Similar Papers
Analysis of Semi-Supervised Learning on Hypergraphs
Machine Learning (CS)
Helps computers learn from complex group connections.
High-order Regularization for Machine Learning and Learning-based Control
Machine Learning (CS)
Makes smart computer programs more understandable.
Hypergraph Contrastive Learning for both Homophilic and Heterophilic Hypergraphs
Machine Learning (CS)
Helps computers understand messy, connected information better.