Soft Quality-Diversity Optimization
By: Saeed Hedayatian, Stefanos Nikolaidis
Potential Business Impact:
Finds many good answers, even for hard problems.
Quality-Diversity (QD) algorithms constitute a branch of optimization concerned with discovering a diverse, high-quality set of solutions to an optimization problem. Current QD methods commonly maintain diversity by dividing the behavior space into discrete regions, ensuring that solutions are distributed across different parts of the space. The QD problem is then solved by searching for the best solution in each region. This approach poses challenges in large solution spaces, where storing many solutions is impractical, and in high-dimensional behavior spaces, where discretization becomes ineffective due to the curse of dimensionality. We present an alternative framing of the QD problem, called "Soft QD", that sidesteps the need for discretization. We validate this formulation by demonstrating its desirable properties, such as monotonicity, and by relating its limiting behavior to the widely used QD Score metric. Furthermore, we leverage it to derive a novel differentiable QD algorithm, "Soft QD Using Approximated Diversity (SQUAD)", and demonstrate empirically that it is competitive with current state-of-the-art methods on standard benchmarks while scaling better to higher-dimensional problems.
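The discretized-archive scheme and the QD Score metric that the abstract refers to can be illustrated with a minimal sketch. This is not the paper's method or any library's API; the function names, the 1-D behavior space in [0, 1), and the grid size are illustrative assumptions.

```python
# Minimal MAP-Elites-style archive sketch (hypothetical, for illustration):
# a 1-D behavior space [0, 1) is split into `n_cells` bins, and each bin
# keeps only the best fitness it has seen.
def make_archive(n_cells):
    return [None] * n_cells

def insert(archive, behavior, fitness):
    """Keep the best fitness in the bin containing `behavior`."""
    n = len(archive)
    idx = min(int(behavior * n), n - 1)  # clamp behavior == 1.0 into last bin
    if archive[idx] is None or fitness > archive[idx]:
        archive[idx] = fitness

def qd_score(archive):
    """QD Score: sum of elite fitnesses over all filled cells,
    rewarding both quality (high fitness) and diversity (filled bins)."""
    return sum(f for f in archive if f is not None)

archive = make_archive(4)
for behavior, fitness in [(0.1, 2.0), (0.15, 3.0), (0.6, 1.0), (0.9, 5.0)]:
    insert(archive, behavior, fitness)

print(qd_score(archive))  # 3.0 + 1.0 + 5.0 = 9.0
```

Note how the two solutions with behaviors 0.1 and 0.15 land in the same bin, so only the better one survives; the hard bin boundaries and the non-differentiable max-per-cell update are exactly the aspects a soft, discretization-free formulation would relax.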
Similar Papers
Multi-Objective Quality-Diversity in Unstructured and Unbounded Spaces
Machine Learning (CS)
Finds many good solutions in tricky, unknown problems.
AutoQD: Automatic Discovery of Diverse Behaviors with Quality-Diversity Optimization
Machine Learning (CS)
Finds new ways for robots to learn tasks.
Overcoming Deceptiveness in Fitness Optimization with Unsupervised Quality-Diversity
Neural and Evolutionary Computing
Finds best robot moves even in tricky situations.