Convergence of a class of gradient-free optimisation schemes when the objective function is noisy, irregular, or both

Published: December 2, 2025 | arXiv ID: 2512.03225v1

By: Christophe Andrieu, Nicolas Chopin, Ettore Fincato, and more

Potential Business Impact:

Improves machine learning on noisy, irregular data.

Business Areas:
Analytics, Data and Analytics

We investigate the convergence properties of a class of iterative algorithms designed to minimize a potentially non-smooth and noisy objective function, which may be algebraically intractable and whose values may be obtained only as the output of a black box. The algorithms considered can be cast under the umbrella of a generalised gradient descent recursion, where the gradient is that of a smooth approximation of the objective function. The framework we develop includes as special cases model-based and mollification methods, two classical approaches to zeroth-order optimisation. The convergence results are obtained under very weak assumptions on the regularity of the objective function and involve a trade-off between the degree of smoothing and the size of the steps taken in the parameter updates. As expected, additional assumptions are required in the stochastic case. We illustrate the relevance of these algorithms and our convergence results through a challenging classification example from machine learning.
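To make the idea concrete, here is a minimal sketch (not the paper's exact scheme) of mollification-based zeroth-order descent: the objective is smoothed by Gaussian convolution, and the gradient of the smoothed function is estimated from noisy black-box evaluations only. The objective `noisy_abs`, the step size, the smoothing level `sigma`, and the sample count are all illustrative assumptions.

```python
import numpy as np

def smoothed_grad(f, x, sigma, n_samples, rng):
    # Monte Carlo estimate of the gradient of the Gaussian-smoothed
    # objective f_sigma(x) = E[f(x + sigma * u)], u ~ N(0, I), using
    # grad f_sigma(x) = E[(f(x + sigma * u) - f(x)) * u] / sigma.
    # Subtracting the baseline f(x) reduces the estimator's variance.
    d = x.shape[0]
    u = rng.standard_normal((n_samples, d))
    f0 = f(x)
    fx = np.array([f(x + sigma * ui) for ui in u])
    return ((fx - f0)[:, None] * u).mean(axis=0) / sigma

def zeroth_order_descent(f, x0, steps=500, lr=0.05, sigma=0.1,
                         n_samples=100, seed=0):
    # Generalised gradient descent on the smoothed surrogate: only
    # black-box evaluations of f are used, never its derivatives.
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    for _ in range(steps):
        x -= lr * smoothed_grad(f, x, sigma, n_samples, rng)
    return x

# Illustrative non-smooth, noisy black-box objective: |x - 1| in each
# coordinate, plus small observation noise; its minimiser is x = (1, 1).
noise_rng = np.random.default_rng(1)
def noisy_abs(x):
    return np.abs(x - 1.0).sum() + 0.01 * noise_rng.standard_normal()

x_star = zeroth_order_descent(noisy_abs, np.array([5.0, -3.0]))
```

The choice of `sigma` reflects the trade-off discussed in the abstract: a larger `sigma` gives a smoother surrogate (and a lower-variance estimate near kinks) at the cost of a larger bias relative to the original objective, which is why the step size and the smoothing level must be balanced against each other.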

Country of Origin
🇫🇷 🇬🇧 France, United Kingdom

Page Count
25 pages

Category
Statistics: Computation