RevoNAD: Reflective Evolutionary Exploration for Neural Architecture Design
By: Gyusam Chang, Jeongyoon Yoon, Shin Han Yi, and more
Recent progress in leveraging large language models (LLMs) has enabled Neural Architecture Design (NAD) systems to generate novel architectures that are not limited to manually predefined search spaces. Nevertheless, LLM-driven generation remains challenging: the token-level design loop is discrete and non-differentiable, preventing feedback from smoothly guiding architectural improvement. As a result, these methods commonly suffer from mode collapse into redundant structures, or drift toward infeasible designs when constructive reasoning is not well grounded. We introduce RevoNAD, a reflective evolutionary orchestrator that bridges LLM-based reasoning with feedback-aligned architectural search. First, RevoNAD presents a Multi-round Multi-expert Consensus that turns isolated design rules into actionable architectural cues. Second, Adaptive Reflective Exploration adjusts the degree of exploration based on reward variance: it explores when feedback is uncertain and refines once stability is reached. Finally, Pareto-guided Evolutionary Selection promotes architectures that jointly optimize accuracy, efficiency, latency, confidence, and structural diversity. Across CIFAR-10, CIFAR-100, ImageNet16-120, COCO-5K, and Cityscapes, RevoNAD achieves state-of-the-art performance. Ablation and transfer studies further validate the effectiveness of RevoNAD in enabling practically reliable and deployable neural architecture design.
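The abstract does not give the update rule behind Adaptive Reflective Exploration; a minimal sketch of the idea, assuming (hypothetically) that the exploration rate is a bounded function of recent reward variance, with all names, window sizes, and constants illustrative rather than taken from the paper:

```python
import statistics

def exploration_rate(rewards, base=0.1, scale=1.0, window=5):
    """Map reward variance over a recent window to an exploration rate.

    High variance -> feedback is uncertain -> explore more;
    low variance  -> feedback is stable    -> refine (exploit).
    `base`, `scale`, and `window` are illustrative hyperparameters,
    not values from the RevoNAD paper.
    """
    recent = rewards[-window:]
    if len(recent) < 2:
        # Too little feedback to estimate variance: explore broadly.
        return base + scale
    var = statistics.pvariance(recent)
    # Saturating map keeps the rate in [base, base + scale).
    return base + scale * var / (1.0 + var)
```

A stable reward history (zero variance) collapses to the base rate, while noisy feedback pushes the rate toward its upper bound.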
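The core of Pareto-guided selection is keeping only non-dominated candidates across the listed objectives; a generic sketch, assuming all objectives have been framed as "higher is better" (e.g. accuracy as-is, latency negated), and not the paper's actual selection procedure:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (objectives framed as higher-is-better)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(candidates):
    """Return the non-dominated subset of candidate objective vectors.

    Each candidate could be e.g. (accuracy, -latency, confidence, ...);
    the objective set is illustrative, not RevoNAD's exact scoring.
    """
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other is not c)]
```

For instance, a candidate that is both less accurate and slower than another is dropped, while candidates trading accuracy for latency survive on the front together.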