Wasserstein Evolution: Evolutionary Optimization as Phase Transition
By: Kaichen Ouyang
This paper establishes a novel connection between evolutionary computation and statistical physics by formalizing evolutionary optimization as a phase transition process. We introduce Wasserstein Evolution (WE), a principled optimization framework that implements the Wasserstein gradient flow of a free energy functional, mathematically bridging evolutionary dynamics with thermodynamics. WE directly translates the physical competition between potential gradient forces (exploitation) and entropic forces (exploration) into algorithmic dynamics, providing an adaptive, theoretically grounded mechanism for balancing exploration and exploitation. Experiments on challenging benchmark functions demonstrate that WE achieves competitive convergence performance while maintaining dramatically higher population diversity than classical methods (GA, DE, CMA-ES). This superior entropy preservation enables effective navigation of multi-modal landscapes without premature convergence, validating the physical interpretation of optimization as a disorder-to-order transition. Our work provides not only an effective optimization algorithm but also a new paradigm for understanding evolutionary computation through statistical physics.
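The abstract does not give the paper's algorithmic details, but the mechanism it describes has a standard particle-level reading: the Wasserstein gradient flow of a free energy F[ρ] = ∫V dρ + T∫ρ log ρ dx is realized by Langevin dynamics, where the drift −∇V supplies the exploitative potential-gradient force and Gaussian diffusion supplies the exploratory entropic force. A minimal sketch under that assumption (the names `we_step`, `grad_V`, and the `temperature` parameter are illustrative, not the paper's API):

```python
import numpy as np

def we_step(X, grad_V, dt=0.01, temperature=0.5, rng=None):
    """One Euler-Maruyama step of the Langevin dynamics whose population
    density follows the Wasserstein gradient flow of
    F[rho] = E_rho[V] + T * E_rho[log rho].

    Drift -grad_V(X) is the exploitation term; the sqrt(2*T*dt) Gaussian
    noise is the entropic exploration term.
    """
    rng = np.random.default_rng() if rng is None else rng
    drift = -grad_V(X)  # potential gradient force (exploitation)
    noise = np.sqrt(2.0 * temperature * dt) * rng.standard_normal(X.shape)
    return X + dt * drift + noise  # deterministic descent + entropic diffusion

# Toy multi-modal test potential (Rastrigin-style gradient), purely illustrative.
def grad_V(X):
    return 2.0 * X + 20.0 * np.pi * np.sin(2.0 * np.pi * X)

# Evolve a population of 64 two-dimensional particles.
X = np.random.default_rng(0).uniform(-5.0, 5.0, size=(64, 2))
for _ in range(2000):
    X = we_step(X, grad_V)
```

Annealing `temperature` toward zero would recover the disorder-to-order picture the abstract describes: at high temperature the entropic term keeps the population spread across modes, and as it cools the potential term concentrates mass near minima.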