Universal Neural Architecture Space: Covering ConvNets, Transformers and Everything in Between
By: Ondřej Týbl, Lukáš Neumann
Potential Business Impact:
Finds best computer "brains" for any task.
We introduce Universal Neural Architecture Space (UniNAS), a generic search space for neural architecture search (NAS) that unifies convolutional networks, transformers, and their hybrid architectures under a single, flexible framework. Our approach enables both the discovery of novel architectures and the analysis of existing ones in a common framework. We also propose a new search algorithm for traversing the proposed search space, and demonstrate that the space contains interesting architectures which, under an identical training setup, outperform state-of-the-art hand-crafted architectures. Finally, a unified toolkit including a standardized training and evaluation protocol is introduced to foster reproducibility and enable fair comparison in NAS research. Overall, this work opens a pathway towards systematically exploring the full spectrum of neural architectures from a unified graph-based NAS perspective.
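To make the graph-based perspective concrete, here is a minimal sketch of what a unified search space could look like: architectures encoded as DAGs whose nodes draw from a mixed vocabulary of convolutional and attention-style operations, with a single-op mutation as an example search move. This is an illustrative assumption, not the paper's actual UniNAS encoding or algorithm; the names `OPS`, `random_architecture`, and `mutate` are hypothetical.

```python
import random

# Hypothetical operation vocabulary: a unified space mixes convolutional
# and attention-style primitives in one encoding (names are illustrative).
OPS = ["conv3x3", "conv1x1", "self_attention", "mlp", "identity"]

def random_architecture(num_nodes=5, seed=None):
    """Sample a random DAG: each node picks an op and wires to earlier
    nodes, so ConvNet-like chains, Transformer-like blocks, and hybrids
    are all expressible in the same representation."""
    rng = random.Random(seed)
    arch = []
    for i in range(num_nodes):
        op = rng.choice(OPS)
        # inputs are drawn from preceding nodes; -1 denotes the stem input
        inputs = sorted(rng.sample(range(-1, i), k=min(2, i + 1)))
        arch.append((op, inputs))
    return arch

def mutate(arch, seed=None):
    """One local edit: swap a single node's operation while keeping the
    wiring fixed -- a minimal move a search algorithm could apply."""
    rng = random.Random(seed)
    i = rng.randrange(len(arch))
    op, inputs = arch[i]
    new_op = rng.choice([o for o in OPS if o != op])
    return arch[:i] + [(new_op, inputs)] + arch[i + 1:]

arch = random_architecture(seed=0)
child = mutate(arch, seed=1)
```

In a real NAS loop, candidates like `child` would be trained (or scored by a proxy) and the search would iterate; the point here is only that one graph encoding can cover heterogeneous architecture families.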
Similar Papers
ONNX-Net: Towards Universal Representations and Instant Performance Prediction for Neural Architectures
Machine Learning (CS)
Tests computer brains instantly, no matter their design.
HyperNAS: Enhancing Architecture Representation for NAS Predictor via Hypernetwork
Machine Learning (CS)
Finds best computer brain designs faster.
NodeNAS: Node-Specific Graph Neural Architecture Search for Out-of-Distribution Generalization
Machine Learning (CS)
Helps computers learn better from different kinds of data.