BLISS: Bandit Layer Importance Sampling Strategy for Efficient Training of Graph Neural Networks
By: Omar Alsaqa, Linh Thi Hoang, Muhammed Fatih Balin
Graph Neural Networks (GNNs) are powerful tools for learning from graph-structured data, but their application to large graphs is hindered by computational costs. The need to process every neighbor for each node creates memory and computational bottlenecks. To address this, we introduce BLISS, a Bandit Layer Importance Sampling Strategy. It uses multi-armed bandits to dynamically select the most informative nodes at each layer, balancing exploration and exploitation to ensure comprehensive graph coverage. Unlike existing static sampling methods, BLISS adapts to evolving node importance, leading to more informed node selection and improved performance. It demonstrates versatility by integrating with both Graph Convolutional Networks (GCNs) and Graph Attention Networks (GATs), adapting its selection policy to their specific aggregation mechanisms. Experiments show that BLISS maintains or exceeds the accuracy of full-batch training.
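To make the core idea concrete, here is a minimal sketch of per-layer bandit node selection using a UCB1-style rule. This is an illustrative assumption, not BLISS's actual algorithm: the class name, the reward signal (e.g. a gradient-norm proxy for how informative a node's message was), and the interface are all hypothetical.

```python
import math

class LayerNodeBandit:
    """UCB1-style bandit over one layer's candidate nodes (illustrative sketch).

    Each node is an arm; the reward is a hypothetical "informativeness"
    signal supplied by the training loop. Unseen nodes score infinity,
    which forces exploration before exploitation takes over.
    """

    def __init__(self, num_nodes, c=1.0):
        self.counts = [0] * num_nodes     # times each node was selected
        self.values = [0.0] * num_nodes   # running mean reward per node
        self.c = c                        # exploration weight
        self.t = 0                        # total updates so far

    def score(self, node):
        if self.counts[node] == 0:
            return float("inf")           # unexplored nodes win ties first
        bonus = self.c * math.sqrt(math.log(self.t) / self.counts[node])
        return self.values[node] + bonus  # exploitation + exploration terms

    def select(self, candidates, k):
        # Pick the k candidate nodes with the highest UCB score.
        return sorted(candidates, key=self.score, reverse=True)[:k]

    def update(self, node, reward):
        # Incremental mean update for the chosen node's reward estimate.
        self.t += 1
        self.counts[node] += 1
        self.values[node] += (reward - self.values[node]) / self.counts[node]
```

In a sampling-based GNN training loop, one such bandit per layer would pick which neighbors to aggregate, then be updated with the observed reward, so node importance estimates evolve as training progresses.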