Hypergraph Foundation Model
By: Yifan Feng, Shiquan Liu, Xiangmin Han, and more
Potential Business Impact:
Helps computers understand complex connections better.
Hypergraph neural networks (HGNNs) effectively model complex high-order relationships in domains like protein interactions and social networks by connecting multiple vertices through hyperedges, enhancing modeling capability and reducing information loss. Developing foundation models for hypergraphs is challenging because hypergraph data is distinctive, combining vertex features with intricate structural information. We present Hyper-FM, a Hypergraph Foundation Model for multi-domain knowledge extraction, featuring Hierarchical High-Order Neighbor Guided Vertex Knowledge Embedding for vertex feature representation and Hierarchical Multi-Hypergraph Guided Structural Knowledge Extraction for structural information. Additionally, we curate 10 text-attributed hypergraph datasets to advance research at the intersection of HGNNs and LLMs. Experiments on these datasets show that Hyper-FM outperforms baseline methods by approximately 13.3%, validating our approach. Furthermore, we propose the first scaling law for hypergraph foundation models, demonstrating that increasing domain diversity significantly enhances performance, unlike merely augmenting vertex and hyperedge counts. This underscores the critical role of domain diversity in scaling hypergraph models.
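To make the hyperedge idea concrete, here is a minimal sketch (not the authors' Hyper-FM code) of a hypergraph as an incidence matrix, with one round of the standard two-stage HGNN-style propagation: vertices aggregate into hyperedges, then hyperedges broadcast back to vertices. The function name and the mean-aggregation choice are illustrative assumptions, not details from the paper.

```python
import numpy as np

def hypergraph_propagate(H, X):
    """One vertex -> hyperedge -> vertex propagation step (mean aggregation).

    H: (num_vertices, num_hyperedges) incidence matrix, H[v, e] = 1 if
       vertex v belongs to hyperedge e.
    X: (num_vertices, dim) vertex feature matrix.
    """
    H = np.asarray(H, dtype=float)
    # Hyperedge features: mean of the features of member vertices.
    edge_deg = H.sum(axis=0, keepdims=True)   # vertices per hyperedge
    E = (H.T @ X) / edge_deg.T                # (num_hyperedges, dim)
    # Updated vertex features: mean over incident hyperedges.
    vert_deg = H.sum(axis=1, keepdims=True)   # hyperedges per vertex
    return (H @ E) / vert_deg                 # (num_vertices, dim)

# Toy example: 4 vertices, 2 hyperedges {0, 1, 2} and {2, 3}.
# A single hyperedge links three vertices at once, which a pairwise
# graph edge cannot express without information loss.
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]])
X = np.array([[1.0], [2.0], [3.0], [4.0]])
X_new = hypergraph_propagate(H, X)
# Vertex 2 sits in both hyperedges, so its update mixes both groups:
# mean(mean(1,2,3), mean(3,4)) = mean(2.0, 3.5) = 2.75
```

Stacking such propagation steps (with learnable weights and nonlinearities) is the basic pattern behind HGNN layers; the abstract's point is that a single hyperedge captures a whole group relation that pairwise edges would have to approximate.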
Similar Papers
Recent Advances in Hypergraph Neural Networks
Machine Learning (CS)
Helps computers understand complex connections better.
HeTa: Relation-wise Heterogeneous Graph Foundation Attack Model
Artificial Intelligence
Makes computer attacks work on many different systems.
Relation-Aware Graph Foundation Model
Machine Learning (CS)
Helps computers understand connections in data better.