ABG-NAS: Adaptive Bayesian Genetic Neural Architecture Search for Graph Representation Learning
By: Sixuan Wang, Jiao Yin, Jinli Cao, and others
Potential Business Impact:
Automatically finds the best neural-network designs for learning from graph data (networks of connections).
Effective and efficient graph representation learning is essential for enabling critical downstream tasks, such as node classification, link prediction, and subgraph search. However, existing graph neural network (GNN) architectures often struggle to adapt to diverse and complex graph structures, limiting their ability to produce structure-aware and task-discriminative representations. To address this challenge, we propose ABG-NAS, a novel framework for automated graph neural network architecture search tailored for efficient graph representation learning. ABG-NAS encompasses three key components: a Comprehensive Architecture Search Space (CASS), an Adaptive Genetic Optimization Strategy (AGOS), and a Bayesian-Guided Tuning Module (BGTM). CASS systematically explores diverse propagation (P) and transformation (T) operations, enabling the discovery of GNN architectures capable of capturing intricate graph characteristics. AGOS dynamically balances exploration and exploitation, ensuring search efficiency and preserving solution diversity. BGTM further optimizes hyperparameters periodically, enhancing the scalability and robustness of the resulting architectures. Empirical evaluations on benchmark datasets (Cora, PubMed, Citeseer, and CoraFull) demonstrate that ABG-NAS consistently outperforms both manually designed GNNs and state-of-the-art neural architecture search (NAS) methods. These results highlight the potential of ABG-NAS to advance graph representation learning by providing scalable and adaptive solutions for diverse graph structures. Our code is publicly available at https://github.com/sserranw/ABG-NAS.
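The abstract describes a genetic search (AGOS) over a space of propagation (P) and transformation (T) operations (CASS). A minimal sketch of such an evolutionary loop is given below; the operation encoding, the toy surrogate fitness, and all hyperparameters are illustrative assumptions, not the authors' implementation, and the Bayesian tuning module (BGTM) is omitted. In the real framework, fitness would be the validation accuracy of a GNN trained with the candidate architecture.

```python
import random

OPS = ["P", "T"]  # propagation and transformation ops, per the CASS search space


def random_arch(length=4):
    """Sample a random architecture as a sequence of P/T operations."""
    return [random.choice(OPS) for _ in range(length)]


def fitness(arch):
    # Placeholder surrogate: ABG-NAS would train and validate a GNN here.
    # For illustration only, we reward architectures that alternate P and T.
    return sum(1 for a, b in zip(arch, arch[1:]) if a != b)


def crossover(p1, p2):
    """Single-point crossover between two parent architectures."""
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:]


def mutate(arch, rate=0.2):
    """Randomly resample each operation with a small probability."""
    return [random.choice(OPS) if random.random() < rate else op for op in arch]


def evolve(pop_size=10, generations=20, length=4, seed=0):
    """Toy genetic loop: keep an elite half (exploitation),
    breed mutated children from it (exploration)."""
    random.seed(seed)
    pop = [random_arch(length) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]
        children = [
            mutate(crossover(random.choice(elite), random.choice(elite)))
            for _ in range(pop_size - len(elite))
        ]
        pop = elite + children
    return max(pop, key=fitness)


best = evolve()
```

The adaptive part of AGOS (dynamically rebalancing exploration vs. exploitation, e.g. by varying the mutation rate or elite fraction over generations) is not shown; this fixed-rate loop only illustrates the basic encode–evaluate–select–vary cycle.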
Similar Papers
NodeNAS: Node-Specific Graph Neural Architecture Search for Out-of-Distribution Generalization
Machine Learning (CS)
Helps computers learn better from different kinds of data.
Learn to Explore: Meta NAS via Bayesian Optimization Guided Graph Generation
Machine Learning (CS)
Builds smarter computer brains for new jobs faster.
Evolution Meets Diffusion: Efficient Neural Architecture Generation
Neural and Evolutionary Computing
Builds better computer brains faster, no training needed.