Hypergraph Neural Sheaf Diffusion: A Symmetric Simplicial Set Framework for Higher-Order Learning
By: Seongjin Choi, Gahee Kim, Yong-Geun Oh
Potential Business Impact:
Helps computers understand complex connections better.
The absence of intrinsic adjacency relations and orientation systems in hypergraphs creates fundamental challenges for constructing sheaf Laplacians of arbitrary degrees. We resolve these limitations through a symmetric simplicial set derived directly from the hypergraph, called the symmetric simplicial lifting, which encodes all possible oriented subrelations within each hyperedge as ordered tuples. This construction canonically defines adjacency via facet maps while inherently preserving hyperedge provenance. We establish that the normalized degree zero sheaf Laplacian on our symmetric simplicial lifting reduces exactly to the traditional normalized sheaf Laplacian on graphs when restricted to graphs, validating its mathematical consistency with prior graph-based sheaf theory. Furthermore, the induced structure preserves all structural information from the original hypergraph, ensuring that every multi-way relational detail is faithfully retained. Leveraging this framework, we introduce Hypergraph Neural Sheaf Diffusion (HNSD), the first principled extension of neural sheaf diffusion to hypergraphs. HNSD operates via the normalized degree zero sheaf Laplacian over the symmetric simplicial lifting, resolving the orientation ambiguity and adjacency sparsity inherent to hypergraph learning. Experimental evaluations demonstrate HNSD's competitive performance across established benchmarks.
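To make the abstract's pipeline concrete, the sketch below assembles a degree zero sheaf Laplacian on vertex stalks and runs one diffusion step. It is a minimal illustration under stated assumptions, not the paper's construction: it treats every vertex pair inside a hyperedge as adjacent after lifting, takes caller-supplied d x d restriction maps F[(v, (u, w))], and normalizes by the scalar diagonal rather than the block-diagonal degree matrix used for sheaf Laplacians. All function and variable names are hypothetical.

import numpy as np
from itertools import combinations

def degree_zero_sheaf_laplacian(num_vertices, d, hyperedges, F):
    """Assemble the degree zero sheaf Laplacian on vertex stalks (copies of R^d).

    Assumption: the lifting makes every vertex pair inside a hyperedge
    adjacent; F[(v, (u, w))] is the d x d restriction map from vertex v's
    stalk into the stalk over the pair (u, w).
    """
    n = num_vertices * d
    L = np.zeros((n, n))
    for h in hyperedges:
        for u, w in combinations(sorted(h), 2):
            Fu, Fw = F[(u, (u, w))], F[(w, (u, w))]
            # Diagonal blocks accumulate F^T F; off-diagonal blocks get -Fu^T Fw.
            L[u*d:(u+1)*d, u*d:(u+1)*d] += Fu.T @ Fu
            L[w*d:(w+1)*d, w*d:(w+1)*d] += Fw.T @ Fw
            L[u*d:(u+1)*d, w*d:(w+1)*d] -= Fu.T @ Fw
            L[w*d:(w+1)*d, u*d:(u+1)*d] -= Fw.T @ Fu
    return L

def normalized_laplacian(L, eps=1e-8):
    """D^{-1/2} L D^{-1/2}, simplified here to the scalar diagonal of L."""
    s = 1.0 / np.sqrt(np.maximum(np.diag(L), eps))
    return L * np.outer(s, s)

def diffusion_step(X, L_norm, alpha=0.5):
    """One linear sheaf-diffusion step: X <- X - alpha * L_norm X."""
    return X - alpha * (L_norm @ X)

# Toy usage: one 3-vertex hyperedge with identity restriction maps.
d, V = 2, 3
hyperedges = [(0, 1, 2)]
F = {(v, (u, w)): np.eye(d)
     for u, w in combinations(range(V), 2) for v in (u, w)}
L = degree_zero_sheaf_laplacian(V, d, hyperedges, F)
X = np.random.default_rng(0).normal(size=(V * d, 4))  # stacked vertex features
X = diffusion_step(X, normalized_laplacian(L))

With identity restriction maps the toy case collapses to the ordinary normalized graph Laplacian on a triangle, which mirrors, in a much weaker setting, the abstract's claim that the construction reduces to the graph normalized sheaf Laplacian when restricted to graphs.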
Similar Papers
Polynomial Neural Sheaf Diffusion: A Spectral Filtering Approach on Cellular Sheaves
Machine Learning (CS)
Makes AI understand complex data better and faster.
Asynchronous Nonlinear Sheaf Diffusion for Multi-Agent Coordination
Optimization and Control
Helps robots work together even with delays.
Cooperative Sheaf Neural Networks
Machine Learning (CS)
Lets computers learn from messy, connected information.