Robustness and Invariance of Hybrid Metaheuristics under Objective Function Transformations
By: Grzegorz Sroka, Sławomir T. Wierzchoń
Potential Business Impact:
Shows which optimization algorithms stay reliable when real-world problems are shifted, rescaled, or rotated.
This paper evaluates the robustness and structural invariance of hybrid population-based metaheuristics under various objective space transformations. A lightweight plug-and-play hybridization operator is applied to nineteen state-of-the-art algorithms, including differential evolution (DE), particle swarm optimization (PSO), and recent bio-inspired methods, without modifying their internal logic. Benchmarking on the CEC-2017 suite across four problem dimensions (10, 30, 50, and 100) is performed under five conditions: the unmodified baseline plus translation, scaling, rotation, and constant shift of the objective. Statistical comparisons based on Wilcoxon and Friedman tests, Bayesian dominance analysis, and convergence trajectory profiling consistently show that differential-based hybrids (e.g., hIMODE, hSHADE, hDMSSA) maintain high accuracy, stability, and invariance under all tested deformations. In contrast, classical algorithms, especially PSO- and HHO-based variants, exhibit significant performance degradation on non-separable or distorted landscapes. The findings confirm the superiority of adaptive, structurally resilient hybrids for real-world optimization tasks subject to domain-specific transformations.
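To make the tested transformations concrete, the sketch below wraps an arbitrary objective function with translation, scaling, rotation, or constant-shift variants. It is a minimal illustration in Python/NumPy, assuming a generic benchmark function; the helper name make_transformed and all parameter ranges (shift bounds, scale factors, the 100.0 offset) are hypothetical and are not the paper's or CEC-2017's actual settings.

```python
import numpy as np

def make_transformed(f, dim, kind="baseline", rng=None):
    """Wrap an objective f: R^dim -> R with one of the transformation types
    discussed above. Parameter values are illustrative, not the paper's."""
    rng = np.random.default_rng(rng)
    if kind == "baseline":
        return f                                  # unmodified objective
    if kind == "translation":
        t = rng.uniform(-10, 10, dim)             # hypothetical shift of the optimum
        return lambda x: f(np.asarray(x) - t)
    if kind == "scaling":
        s = rng.uniform(0.5, 2.0, dim)            # hypothetical per-coordinate scaling
        return lambda x: f(np.asarray(x) * s)
    if kind == "rotation":
        q, _ = np.linalg.qr(rng.standard_normal((dim, dim)))  # random orthogonal matrix
        return lambda x: f(q @ np.asarray(x))
    if kind == "shift":
        c = 100.0                                 # hypothetical constant added to f(x)
        return lambda x: f(np.asarray(x)) + c
    raise ValueError(f"unknown transformation kind: {kind}")

# Usage: evaluate a simple sphere objective on a rotated landscape.
sphere = lambda x: float(np.sum(np.square(x)))
rotated_sphere = make_transformed(sphere, dim=10, kind="rotation", rng=0)
print(rotated_sphere(np.ones(10)))  # rotation preserves the sphere's value (orthogonal q)
```

A study of invariance then amounts to running each metaheuristic on f and on its transformed counterparts and comparing the resulting error distributions, e.g. with the Wilcoxon and Friedman tests named in the abstract.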
Similar Papers
A Study of Hybrid and Evolutionary Metaheuristics for Single Hidden Layer Feedforward Neural Network Architecture
Neural and Evolutionary Computing
Makes computer learning faster and more accurate.
Sequential, Parallel and Consecutive Hybrid Evolutionary-Swarm Optimization Metaheuristics
Neural and Evolutionary Computing
Boosts problem-solving for tough math puzzles.
Fast and robust parametric and functional learning with Hybrid Genetic Optimisation (HyGO)
Neural and Evolutionary Computing
Makes designs better, faster, and more efficient.