Score: 1

Evolving Deep Learning Optimizers

Published: December 5, 2025 | arXiv ID: 2512.11853v1

By: Mitchell Marfinetz

Potential Business Impact:

Automatically discovers optimizers that help neural networks train faster and reach higher accuracy.

Business Areas:
A/B Testing, Data and Analytics

We present a genetic algorithm framework for automatically discovering deep learning optimization algorithms. Our approach encodes optimizers as genomes that specify combinations of primitive update terms (gradient, momentum, RMS normalization, Adam-style adaptive terms, and sign-based updates) along with hyperparameters and scheduling options. Through evolutionary search over 50 generations with a population of 50 individuals, evaluated across multiple vision tasks, we discover an evolved optimizer that outperforms Adam by 2.6% in aggregate fitness and achieves a 7.7% relative improvement on CIFAR-10. The evolved optimizer combines sign-based gradient terms with adaptive moment estimation, uses lower momentum coefficients than Adam ($\beta_1 = 0.86$, $\beta_2 = 0.94$), and notably disables bias correction while enabling learning rate warmup and cosine decay. Our results demonstrate that evolutionary search can discover competitive optimization algorithms and reveal design principles that differ from hand-crafted optimizers. Code is available at https://github.com/mmarfinetz/evo-optimizer.
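
The abstract gives enough detail to sketch what the evolved update rule might look like. Below is a minimal Python sketch, assuming an equal-weight mix of the sign-based and Adam-style terms and a linear warmup; only the coefficients ($\beta_1 = 0.86$, $\beta_2 = 0.94$), the disabled bias correction, and the warmup-plus-cosine-decay schedule come from the abstract. The class name, mixing weights, and step counts are illustrative, not the paper's actual genome.

```python
# Minimal sketch of an update step in the spirit of the evolved optimizer
# described in the abstract. How the paper mixes the primitive terms is an
# assumption here; coefficients, disabled bias correction, and the schedule
# are taken from the abstract.

import math
import numpy as np

class EvolvedOptimizerSketch:
    def __init__(self, lr=1e-3, beta1=0.86, beta2=0.94, eps=1e-8,
                 warmup_steps=500, total_steps=10_000):
        self.lr = lr
        self.beta1, self.beta2, self.eps = beta1, beta2, eps
        self.warmup_steps, self.total_steps = warmup_steps, total_steps
        self.m = None  # first moment (momentum term)
        self.v = None  # second moment (RMS term)
        self.t = 0

    def _schedule(self):
        # Linear warmup followed by cosine decay, as the abstract reports.
        if self.t < self.warmup_steps:
            return self.t / max(1, self.warmup_steps)
        progress = (self.t - self.warmup_steps) / max(
            1, self.total_steps - self.warmup_steps)
        return 0.5 * (1.0 + math.cos(math.pi * min(1.0, progress)))

    def step(self, params, grads):
        if self.m is None:
            self.m = np.zeros_like(params)
            self.v = np.zeros_like(params)
        self.t += 1
        self.m = self.beta1 * self.m + (1 - self.beta1) * grads
        self.v = self.beta2 * self.v + (1 - self.beta2) * grads**2
        # No bias correction: the evolved genome disables it, so the raw
        # moments are used directly (early steps are implicitly damped).
        adam_term = self.m / (np.sqrt(self.v) + self.eps)
        sign_term = np.sign(grads)
        # Equal-weight mix of the Adam-style and sign-based terms is a
        # guess; the paper's genome may weight or gate them differently.
        update = 0.5 * adam_term + 0.5 * sign_term
        return params - self.lr * self._schedule() * update

# Toy usage: one update on a small parameter vector.
opt = EvolvedOptimizerSketch(total_steps=1_000)
w = np.ones(3)
w = opt.step(w, grads=np.array([0.1, -0.2, 0.3]))
```

In the paper's framework the genome would also encode which primitive terms are active and their hyperparameters; this sketch hard-codes one plausible combination of the terms the abstract names.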

Country of Origin
🇺🇸 United States

Repos / Data Links

https://github.com/mmarfinetz/evo-optimizer

Page Count
6 pages

Category
Computer Science:
Neural and Evolutionary Computing