Higher-Order Regularization Learning on Hypergraphs

Published: October 30, 2025 | arXiv ID: 2510.26533v1

By: Adrien Weihs, Andrea Bertozzi, Matthew Thorpe

Potential Business Impact:

Enables machine-learning systems to learn from data with complex, multi-way relationships, such as those modeled by hypergraphs.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Higher-Order Hypergraph Learning (HOHL) was recently introduced as a principled alternative to classical hypergraph regularization, enforcing higher-order smoothness via powers of multiscale Laplacians induced by the hypergraph structure. Prior work established the well- and ill-posedness of HOHL through an asymptotic consistency analysis in geometric settings. We extend this theoretical foundation by proving the consistency of a truncated version of HOHL and deriving explicit convergence rates when HOHL is used as a regularizer in fully supervised learning. We further demonstrate its strong empirical performance in active learning and in datasets lacking an underlying geometric structure, highlighting HOHL's versatility and robustness across diverse learning settings.
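To make the abstract's core idea concrete, here is a minimal sketch of regularization by powers of a Laplacian in a fully supervised setting. It is not the authors' method: it substitutes a plain graph Laplacian for the paper's multiscale hypergraph Laplacians, and the function names, the penalty weight `tau`, and the toy chain graph are illustrative assumptions.

```python
import numpy as np

def laplacian(W):
    """Unnormalized graph Laplacian L = D - W for a symmetric weight matrix W."""
    return np.diag(W.sum(axis=1)) - W

def fit_laplacian_power(W, labeled_idx, y, s=2, tau=1.0):
    """Solve min_u ||u[labeled] - y||^2 + tau * u^T L^s u.

    Larger s penalizes higher-order roughness, the general idea behind
    higher-order smoothness regularization; this sketch stands in for the
    paper's multiscale hypergraph construction.
    """
    n = W.shape[0]
    L = laplacian(W)
    Ls = np.linalg.matrix_power(L, s)   # higher-order smoothness penalty
    M = np.zeros((n, n))
    M[labeled_idx, labeled_idx] = 1.0   # data-fidelity mask on labeled nodes
    b = np.zeros(n)
    b[labeled_idx] = y
    return np.linalg.solve(M + tau * Ls, b)

# Toy usage: a 5-node chain graph with the two endpoints labeled.
W = np.zeros((5, 5))
for i in range(4):
    W[i, i + 1] = W[i + 1, i] = 1.0
u = fit_laplacian_power(W, labeled_idx=np.array([0, 4]),
                        y=np.array([0.0, 1.0]), s=2)
print(u)  # labels interpolated smoothly across the chain
```

The linear system is well-posed here because the data-fidelity mask pins the two labeled nodes while the Laplacian-power term forces the remaining values to vary smoothly between them; raising `s` makes the interpolant smoother in a higher-order sense.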

Page Count
43 pages

Category
Computer Science:
Machine Learning (CS)