NEVO-GSPT: Population-Based Neural Network Evolution Using Inflate and Deflate Operators

Published: January 13, 2026 | arXiv ID: 2601.08657v1

By: Davide Farinati, Frederico J. J. B. Santos, Leonardo Vanneschi, and more

Evolving neural network architectures is a computationally demanding process. Traditional methods often require an extensive search through large architectural spaces and offer limited understanding of how structural modifications influence model behavior. This paper introduces NEVO-GSPT, a novel Neuroevolution algorithm based on two key innovations. First, we adapt geometric semantic operators (GSOs) from genetic programming to neural network evolution, ensuring that architectural changes produce predictable effects on network semantics within a unimodal error surface. Second, we introduce a novel operator (DGSM) that enables controlled reduction of network size while maintaining the semantic properties of GSOs. Unlike traditional approaches, NEVO-GSPT's efficient evaluation mechanism, which only requires computing the semantics of newly added components, allows for efficient population-based training, resulting in a comprehensive exploration of the search space at a fraction of the computational cost. Experimental results on four regression benchmarks show that NEVO-GSPT consistently evolves compact neural networks that achieve performance comparable to or better than established methods in the literature, such as standard neural networks, SLIM-GSGP, TensorNEAT, and SLM.
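To make the inflate/deflate idea concrete, here is a minimal sketch of how geometric semantic inflate and deflate operators can work on a model's semantics (its vector of outputs on the training set). All names (`inflate`, `deflate`, the mutation step `step`, and the use of a history list) are illustrative assumptions, not the paper's actual NEVO-GSPT implementation, which operates on neural network components; the sketch only shows the core property that inflation adds a bounded perturbation whose effect is known in advance, and that deflation removes a previously added component exactly, so only the new component's semantics ever needs to be computed.

```python
import numpy as np

rng = np.random.default_rng(0)

def inflate(semantics, history, step=0.1):
    # Inflate (GSM-style, illustrative): add a bounded random perturbation
    # scaled by the mutation step. The difference of two vectors in [0, 1)
    # lies in (-1, 1), so the change to each output is at most `step`.
    delta = step * (rng.random(semantics.shape) - rng.random(semantics.shape))
    history.append(delta)          # remember the component for later deflation
    return semantics + delta       # offspring semantics, no full re-evaluation

def deflate(semantics, history):
    # Deflate (DGSM-style, illustrative): remove the most recently added
    # component, exactly restoring the semantics it had before that inflate.
    delta = history.pop()
    return semantics - delta

# Toy usage: semantics of a model on a 4-example training set.
parent = np.array([1.0, -0.5, 2.0, 0.0])
history = []
child = inflate(parent, history)
restored = deflate(child, history)
print(np.allclose(restored, parent))   # deflate undoes inflate exactly
```

The key design point this illustrates is that each operator changes the semantics by a known, bounded amount, so fitness can be updated incrementally across a whole population instead of re-running every network from scratch.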

Category
Computer Science:
Neural and Evolutionary Computing