Formal equivalence between global optimization consistency and random search
By: Gaëtan Serré
Potential Business Impact:
Proves when search algorithms are guaranteed to find the best possible answer.
We formalize a proof that any stochastic and iterative global optimization algorithm is consistent over Lipschitz continuous functions if and only if it samples the whole search space. To achieve this, we use the L$\exists$$\forall$N theorem prover and the Mathlib library. The major challenge of this formalization, apart from the technical aspects of the proof itself, is to converge on a definition of a stochastic and iterative global optimization algorithm that is both general enough to encompass all algorithms of this type and specific enough to be used in a formal proof. We define such an algorithm as a pair consisting of an initial probability measure and a sequence of Markov kernels that describe the distribution of the next point sampled by the algorithm given the previous points and their evaluations. We then construct a probability measure on finite and infinite sequences of iterations of the algorithm using the Ionescu-Tulcea theorem.
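To make the definition concrete, here is a minimal Lean 4 sketch of how such an algorithm could be stated on top of Mathlib's measure and kernel libraries. The structure name, field names, and the choice of ℝ for evaluation values are illustrative assumptions, not the paper's actual code; they only mirror the description above (an initial probability measure plus a sequence of Markov kernels conditioned on the previous points and their evaluations).

```lean
import Mathlib

open MeasureTheory ProbabilityTheory

/-- Sketch (illustrative names, not the paper's code): a stochastic and
iterative global optimization algorithm on a search space `α` is an initial
probability measure together with, for every step `n`, a Markov kernel giving
the law of the next sampled point conditionally on the `n + 1` previous
points paired with their evaluations. -/
structure StochasticIterativeAlgorithm (α : Type*) [MeasurableSpace α] where
  /-- Law of the first sampled point. -/
  init : Measure α
  init_isProb : IsProbabilityMeasure init
  /-- Law of the next point given the `n + 1` previous points and their values. -/
  step : (n : ℕ) → Kernel (Fin (n + 1) → α × ℝ) α
  step_isMarkov : ∀ n, IsMarkovKernel (step n)
```

From data of this shape, the Ionescu-Tulcea theorem is what extends the initial measure and the kernels to a single probability measure on finite and infinite sequences of iterations, which is the measure the consistency statement quantifies over.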
Similar Papers
A Unifying Framework for Global Optimization: From Theory to Formalization
Formal Languages and Automata Theory
Makes computer math proofs more reliable.
Stochastic Optimization with Random Search
Optimization and Control
Improves computer guessing for tricky problems.
Identity Testing for Stochastic Languages
Formal Languages and Automata Theory
Checks whether two probabilistic language models generate the same language.