On the Interplay between Graph Structure and Learning Algorithms in Graph Neural Networks
By: Junwei Su, Chuan Wu
Potential Business Impact:
Helps computers learn better from connected data.
This paper studies the interplay between learning algorithms and graph structure for graph neural networks (GNNs). Existing theoretical studies on the learning dynamics of GNNs primarily focus on the convergence rates of learning algorithms under the interpolation regime (noise-free) and offer only a crude connection between these dynamics and the actual graph structure (e.g., maximum degree). This paper aims to bridge this gap by investigating the excess risk (generalization performance) of learning algorithms in GNNs within the generalization regime (with noise). Specifically, we extend the conventional settings from the learning theory literature to the context of GNNs and examine how graph structure influences the performance of learning algorithms such as stochastic gradient descent (SGD) and Ridge regression. Our study makes several key contributions toward understanding the interplay between graph structure and learning in GNNs. First, we derive the excess risk profiles of SGD and Ridge regression in GNNs and connect these profiles to the graph structure through spectral graph theory. With this established framework, we further explore how different graph structures (regular vs. power-law) impact the performance of these algorithms through comparative analysis. Additionally, we extend our analysis to multi-layer linear GNNs, revealing an increasing non-isotropic effect on the excess risk profile, thereby offering new insights into the over-smoothing issue in GNNs from the perspective of learning algorithms. Our empirical results align with our theoretical predictions, collectively showcasing a coupling relationship among graph structure, GNNs, and learning algorithms, and providing practical insights for GNN algorithm design and selection.
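To make the setup concrete, the sketch below is a minimal Python toy (not the paper's construction) of the ingredients the abstract names: a one-layer linear GNN aggregation, labels with noise (the generalization regime), SGD and Ridge regression fit on the aggregated features, and the graph spectrum that enters the risk. Everything here is an illustrative assumption: the `normalized_adjacency` helper, the ring-lattice graph standing in for a regular graph, the ridge penalty and learning rate, and the covariance-weighted parameter error used as an excess-risk proxy.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalized_adjacency(A):
    """Symmetric normalization D^{-1/2}(A + I)D^{-1/2}, GCN-style aggregation (an assumption here)."""
    A = A + np.eye(A.shape[0])                 # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    return d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

n, p, sigma = 200, 20, 0.5                     # nodes, feature dim, label-noise level
X = rng.normal(size=(n, p))
w_star = rng.normal(size=p)                    # ground-truth parameter

# Regular graph example: a ring lattice, each node linked to its two neighbors
A = np.zeros((n, n))
for i in range(n):
    A[i, (i - 1) % n] = A[i, (i + 1) % n] = 1.0

S = normalized_adjacency(A)
X_agg = S @ X                                  # one round of linear GNN aggregation
y = X_agg @ w_star + sigma * rng.normal(size=n)  # noisy labels: generalization regime

# Ridge regression on the aggregated features (illustrative penalty)
lam = 1.0
w_ridge = np.linalg.solve(X_agg.T @ X_agg + lam * np.eye(p), X_agg.T @ y)

# Plain SGD on the squared loss over the same aggregated features
w_sgd, lr = np.zeros(p), 0.01
for epoch in range(50):
    for i in rng.permutation(n):
        w_sgd -= lr * (X_agg[i] @ w_sgd - y[i]) * X_agg[i]

# Excess-risk proxy: parameter error weighted by the aggregated-feature covariance,
# whose spectrum is shaped by the graph operator S
C = X_agg.T @ X_agg / n
for name, w in [("ridge", w_ridge), ("sgd", w_sgd)]:
    err = w - w_star
    print(name, float(err @ C @ err))

# Multi-layer view: powers of S shrink all but the top eigenvalues, skewing the
# effective covariance -- one toy view of the non-isotropic / over-smoothing effect
eigvals = np.linalg.eigvalsh(S)
for k in (1, 2, 4):
    print(f"S^{k}: top eigenvalue {eigvals[-1]**k:.3f}, median {np.median(eigvals**k):.3f}")
```

The weighting matrix `C` is where the graph enters: its eigenvalues are shaped by the spectrum of `S`, so different structures (regular vs. power-law) and deeper aggregation (`S^k`) change the risk profile of SGD and Ridge differently, which is the kind of coupling the paper analyzes formally.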
Similar Papers
Learning the Structure of Connection Graphs
Machine Learning (CS)
Finds hidden patterns in connected data.
Leveraging Classical Algorithms for Graph Neural Networks
Machine Learning (CS)
Teaches computers to predict drug effects better.
A Derandomization Framework for Structure Discovery: Applications in Neural Networks and Beyond
Machine Learning (Stat)
Helps computers learn better with less data.