Score: 1

ZeroLM: Data-Free Transformer Architecture Search for Language Models

Published: March 24, 2025 | arXiv ID: 2503.18646v1

By: Zhen-Song Chen, Hong-Wei Ding, Xian-Jia Wang, and more

Potential Business Impact:

Finds the best-performing AI model designs faster and at lower cost.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Neural architecture search (NAS) provides a systematic framework for automating the design of neural network architectures, yet its widespread adoption is hindered by prohibitive computational requirements. Existing zero-cost proxy methods, while reducing search overhead, demonstrate inadequate performance in architecture ranking tasks, particularly for Transformer-based models where they often underperform simple parameter counting metrics. Current automated proxy discovery approaches suffer from extended search times, susceptibility to data overfitting, and structural complexity. This paper introduces a novel zero-cost proxy methodology that quantifies model capacity through efficient weight statistics computation while decomposing Transformer architectures into functionally distinct sub-modules, thereby optimizing the balance of their contributions to overall performance. Our comprehensive evaluation demonstrates the superiority of this approach, achieving a Spearman's rho of 0.76 and Kendall's tau of 0.53 on the FlexiBERT benchmark. The proposed method exhibits exceptional computational efficiency while maintaining robust performance across diverse NAS benchmark tasks, offering a practical solution for large-scale architecture search.
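To make the idea concrete, below is a minimal, hypothetical sketch of a data-free ("zero-cost") proxy in the spirit the abstract describes: an untrained Transformer is scored purely from weight statistics, computed separately for attention and feed-forward sub-modules and then combined with a weighted sum. The choice of statistic (mean absolute weight), the sub-module split, and the mixing weights here are illustrative assumptions, not the paper's exact ZeroLM formulation.

```python
import torch
import torch.nn as nn


def submodule_weight_stat(module: nn.Module) -> float:
    """Cheap capacity statistic from weights only (no data, no forward pass)."""
    total, count = 0.0, 0
    for p in module.parameters():
        total += p.detach().abs().sum().item()
        count += p.numel()
    return total / max(count, 1)


def zero_cost_proxy(model: nn.TransformerEncoder,
                    attn_weight: float = 0.5,
                    ffn_weight: float = 0.5) -> float:
    """Combine per-sub-module statistics into a single architecture score.

    The 50/50 weighting is a placeholder; balancing these contributions is
    exactly what the paper says it optimizes.
    """
    attn_stats, ffn_stats = [], []
    for layer in model.layers:
        attn_stats.append(submodule_weight_stat(layer.self_attn))
        # Treat the two linear projections as the feed-forward sub-module.
        ffn_stats.append((submodule_weight_stat(layer.linear1)
                          + submodule_weight_stat(layer.linear2)) / 2)
    attn_score = sum(attn_stats) / len(attn_stats)
    ffn_score = sum(ffn_stats) / len(ffn_stats)
    return attn_weight * attn_score + ffn_weight * ffn_score


if __name__ == "__main__":
    # Score two randomly initialized candidate architectures and rank them,
    # without ever loading data or training.
    candidates = {
        "small": nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=128, nhead=4, dim_feedforward=256),
            num_layers=2),
        "large": nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=256, nhead=8, dim_feedforward=1024),
            num_layers=4),
    }
    scores = {name: zero_cost_proxy(m) for name, m in candidates.items()}
    print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))
```

On a benchmark such as FlexiBERT, the quality of a proxy like this is typically judged by ranking candidate architectures by proxy score and comparing that ranking to ground-truth accuracies with Spearman's rho and Kendall's tau (e.g., via scipy.stats.spearmanr and scipy.stats.kendalltau); the 0.76 and 0.53 figures quoted above are correlation coefficients of that kind.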

Country of Origin
🇨🇦 🇨🇳 Canada, China

Page Count
17 pages

Category
Computer Science:
Computation and Language