Transferring Social Network Knowledge from Multiple GNN Teachers to Kolmogorov-Arnold Networks
By: Yuan-Hung Chao, Chia-Hsun Lu, Chih-Ya Shen
Potential Business Impact:
Lets AI make fast predictions without needing the graph's connections.
Graph Neural Networks (GNNs) have shown strong performance on graph-structured data, but their reliance on graph connectivity often limits scalability and efficiency. Kolmogorov-Arnold Networks (KANs), a recent architecture with learnable univariate functions, offer strong nonlinear expressiveness and efficient inference. In this work, we integrate KANs into three popular GNN architectures (GAT, SGC, and APPNP), resulting in three new models: KGAT, KSGC, and KAPPNP. We further adopt a multi-teacher knowledge amalgamation framework, where knowledge from multiple KAN-based GNNs is distilled into a graph-independent KAN student model. Experiments on benchmark datasets show that the proposed models improve node classification accuracy, and the knowledge amalgamation approach significantly boosts student model performance. Our findings highlight the potential of KANs for enhancing GNN expressiveness and for enabling efficient, graph-free inference.
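The abstract's two core ingredients can be sketched in a few lines: a KAN layer that sums learnable univariate functions on each input-output edge, and an amalgamation target that averages the softened predictions of several teachers. This is a minimal illustrative sketch, not the authors' implementation; the piecewise-linear edge functions, the grid size, and the simple averaging of teacher distributions are all assumptions made for clarity.

```python
import numpy as np

def kan_edge(x, grid, coefs):
    """A learnable univariate function, here modeled as piecewise-linear
    interpolation over a fixed grid (a crude stand-in for a spline)."""
    return np.interp(x, grid, coefs)

def kan_layer(x, grids, coefs):
    """One KAN layer: each output is a sum of univariate functions,
    one per input feature.
    x: (n_in,) input vector
    grids: list of n_in grid arrays
    coefs: (n_out, n_in, n_grid) learnable values at the grid points"""
    n_out, n_in, _ = coefs.shape
    out = np.zeros(n_out)
    for j in range(n_out):
        for i in range(n_in):
            out[j] += kan_edge(x[i], grids[i], coefs[j, i])
    return out

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def amalgamated_target(teacher_logits, T=2.0):
    """Multi-teacher knowledge amalgamation target: average the
    temperature-softened class distributions of all teachers. The
    graph-free student is then trained to match this distribution
    from node features alone."""
    probs = [softmax(np.asarray(l) / T) for l in teacher_logits]
    return np.mean(probs, axis=0)
```

In this sketch the student never touches the adjacency structure: the teachers (e.g. KGAT, KSGC, KAPPNP) consume the graph during training, while the student only sees per-node features and the amalgamated soft labels, which is what enables graph-free inference.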
Similar Papers
A Practitioner's Guide to Kolmogorov-Arnold Networks
Machine Learning (CS)
Makes computer learning smarter and easier to understand.
Enhancing Federated Learning with Kolmogorov-Arnold Networks: A Comparative Study Across Diverse Aggregation Strategies
Machine Learning (CS)
Makes AI learn better from many computers.
Adaptive graph Kolmogorov-Arnold network for 3D human pose estimation
CV and Pattern Recognition
Helps computers guess body poses from pictures.