Transferring Social Network Knowledge from Multiple GNN Teachers to Kolmogorov-Arnold Networks

Published: August 8, 2025 | arXiv ID: 2508.06663v1

By: Yuan-Hung Chao, Chia-Hsun Lu, Chih-Ya Shen

Potential Business Impact:

Lets AI make fast predictions on social-network data without needing the full connection graph at inference time.

Graph Neural Networks (GNNs) have shown strong performance on graph-structured data, but their reliance on graph connectivity often limits scalability and efficiency. Kolmogorov-Arnold Networks (KANs), a recent architecture with learnable univariate functions, offer strong nonlinear expressiveness and efficient inference. In this work, we integrate KANs into three popular GNN architectures (GAT, SGC, and APPNP), resulting in three new models: KGAT, KSGC, and KAPPNP. We further adopt a multi-teacher knowledge amalgamation framework, where knowledge from multiple KAN-based GNNs is distilled into a graph-independent KAN student model. Experiments on benchmark datasets show that the proposed models improve node classification accuracy, and the knowledge amalgamation approach significantly boosts student model performance. Our findings highlight the potential of KANs for enhancing GNN expressiveness and for enabling efficient, graph-free inference.
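The multi-teacher amalgamation step can be illustrated with a minimal sketch. The abstract does not specify the exact loss or teacher-weighting scheme, so the uniform averaging of temperature-softened teacher outputs and the cross-entropy distillation objective below are assumptions drawn from standard knowledge-distillation practice, not the paper's confirmed method:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Numerically stable temperature-scaled softmax."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def amalgamated_targets(teacher_logits_list, temperature=2.0):
    """Soft targets for the graph-free student: a uniform average of the
    softened class distributions from each KAN-based GNN teacher.
    (Uniform weights are an assumption; the paper may weight teachers.)"""
    probs = [softmax(logits, temperature) for logits in teacher_logits_list]
    return np.mean(probs, axis=0)

def distillation_loss(student_logits, soft_targets, temperature=2.0):
    """Cross-entropy between the amalgamated teacher distribution and the
    student's softened prediction; minimized during student training."""
    log_p = np.log(softmax(student_logits, temperature) + 1e-12)
    return float(-(soft_targets * log_p).sum(axis=-1).mean())
```

In use, each teacher (e.g. KGAT, KSGC, KAPPNP) produces per-node logits using the graph, while the student sees only node features; training the student against `amalgamated_targets` is what makes its inference graph-free.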

Country of Origin
🇹🇼 Taiwan, Province of China

Page Count
6 pages

Category
Computer Science:
Machine Learning (CS)