Learn to Explore: Meta NAS via Bayesian Optimization Guided Graph Generation
By: Zijun Sun, Yanning Shen
Potential Business Impact:
Builds smarter computer brains for new jobs faster.
Neural Architecture Search (NAS) automates the design of high-performing neural networks but typically targets a single predefined task, thereby restricting its real-world applicability. To address this, Meta Neural Architecture Search (Meta-NAS) has emerged as a promising paradigm that leverages prior knowledge across tasks to enable rapid adaptation to new ones. Nevertheless, existing Meta-NAS methods often struggle with poor generalization, limited search spaces, or high computational costs. In this paper, we propose a novel Meta-NAS framework, GraB-NAS. Specifically, GraB-NAS first models neural architectures as graphs, and then develops a hybrid search strategy to find and generate new graphs that correspond to promising neural architectures. The strategy combines global architecture search via Bayesian Optimization in the search space with local exploration for novel neural networks via gradient ascent in the latent space. This hybrid search strategy allows GraB-NAS to discover task-aware architectures with strong performance, even beyond the predefined search space. Extensive experiments demonstrate that GraB-NAS outperforms state-of-the-art Meta-NAS baselines, achieving better generalization and search effectiveness.
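To make the hybrid search concrete, below is a minimal sketch of the global-plus-local loop the abstract describes: a Gaussian-process surrogate drives Bayesian Optimization over a continuous latent space of architectures, and each selected candidate is then refined by gradient ascent on the surrogate's predicted score. This is an illustration under assumptions, not the paper's implementation: the graph encoder/decoder is omitted, `evaluate_architecture` is a toy stand-in for decoding and training a network, and finite-difference gradients replace whatever latent gradient the authors actually use. All function names and hyperparameters here are hypothetical.

```python
# Sketch of a hybrid global (BO) + local (gradient ascent) search in a
# latent architecture space. Assumes architectures are already embedded
# as continuous vectors (e.g., by a pretrained graph VAE, not shown).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
LATENT_DIM = 8

def evaluate_architecture(z):
    """Toy stand-in for decoding z into a graph and training/scoring it."""
    return -np.sum((z - 0.5) ** 2)  # known optimum at z = 0.5

def expected_improvement(mu, sigma, best):
    """Standard EI acquisition for the global Bayesian Optimization step."""
    sigma = np.maximum(sigma, 1e-9)
    gamma = (mu - best) / sigma
    return sigma * (gamma * norm.cdf(gamma) + norm.pdf(gamma))

def gradient_ascent(gp, z, steps=20, lr=0.05, eps=1e-4):
    """Local exploration: climb the surrogate's predicted score via
    finite-difference gradient ascent in the latent space."""
    z = z.copy()
    for _ in range(steps):
        base = gp.predict(z[None])[0]
        grad = np.zeros_like(z)
        for i in range(len(z)):
            zp = z.copy()
            zp[i] += eps
            grad[i] = (gp.predict(zp[None])[0] - base) / eps
        z += lr * grad
    return z

# Initial design: random latent points with their true scores.
Z = rng.uniform(0, 1, size=(10, LATENT_DIM))
y = np.array([evaluate_architecture(z) for z in Z])

for _ in range(15):
    gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(Z, y)
    # Global step: pick the candidate maximizing expected improvement.
    cand = rng.uniform(0, 1, size=(256, LATENT_DIM))
    mu, sigma = gp.predict(cand, return_std=True)
    z_next = cand[np.argmax(expected_improvement(mu, sigma, y.max()))]
    # Local step: refine the candidate on the surrogate; this move can
    # leave the sampled region, i.e., escape the predefined search space.
    z_next = gradient_ascent(gp, z_next)
    Z = np.vstack([Z, z_next])
    y = np.append(y, evaluate_architecture(z_next))

print("best score found:", y.max())
```

The design point the sketch illustrates is the division of labor: the acquisition function handles exploration across the whole space, while the gradient step exploits the surrogate locally and can propose latent points (hence decoded architectures) outside the original candidate pool.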
Similar Papers
Federated Neural Architecture Search with Model-Agnostic Meta Learning
Machine Learning (CS)
Finds best AI designs faster for everyone.
Efficient Global Neural Architecture Search
Computer Vision and Pattern Recognition
Finds best computer brain designs faster.
ABG-NAS: Adaptive Bayesian Genetic Neural Architecture Search for Graph Representation Learning
Machine Learning (CS)
Finds best computer models for understanding connections.