Stochastic Control Methods for Optimization

Published: January 3, 2026 | arXiv ID: 2601.01248v1

By: Jinniao Qiu

Potential Business Impact:

Finds the best possible solutions to hard optimization problems by running many randomized simulations.

Business Areas:
Risk Management, Professional Services

In this work, we investigate a stochastic control framework for global optimization over both finite-dimensional Euclidean spaces and the Wasserstein space of probability measures. In the Euclidean setting, the original minimization problem is approximated by a family of regularized stochastic control problems; using dynamic programming, we analyze the associated Hamilton-Jacobi-Bellman equations and obtain tractable representations via the Cole-Hopf transform and the Feynman-Kac formula. For optimization over probability measures, we formulate a regularized mean-field control problem characterized by a master equation, and further approximate it by controlled N-particle systems. We establish that, as the regularization parameter tends to zero (and as the particle number tends to infinity for the optimization over probability measures), the value of the control problem converges to the global minimum of the original objective. Building on the resulting probabilistic representations, Monte Carlo-based numerical schemes are proposed and numerical experiments are reported to illustrate the practical performance of the methods and to support the theoretical convergence rates.
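
As a rough illustration of the Euclidean case, the sketch below estimates a regularized value of the form v_eps(x) = -eps * log E[exp(-f(x + sqrt(2*eps*T) Z)/eps)] (Z standard Gaussian), which is the kind of Cole-Hopf/Feynman-Kac representation the abstract alludes to, and reads off an approximate minimizer from the Gibbs weights. This is a minimal sketch under those assumptions; the paper's actual regularization, dynamics, and parameters may differ, and the function names and test objective here are purely illustrative.

```python
import numpy as np


def soft_min_value(f, x0, eps=0.05, T=2.0, n_samples=100_000, rng=None):
    """Monte Carlo estimate of the regularized value
        v_eps(x0) = -eps * log E[ exp(-f(x0 + sqrt(2*eps*T) * Z) / eps) ],
    which approaches the global minimum of f over the region explored by
    the diffusion as eps -> 0.  Also returns a Gibbs-weighted average of
    the samples as a rough estimate of the minimizer (Laplace principle).
    The exploration radius scales like sqrt(2*eps*T); eps controls how
    sharply low values of f are favored."""
    rng = np.random.default_rng() if rng is None else rng
    x0 = np.asarray(x0, dtype=float)

    # Sample endpoints of the (uncontrolled) diffusion started at x0.
    Z = rng.standard_normal((n_samples, x0.size))
    X = x0 + np.sqrt(2.0 * eps * T) * Z
    fX = np.apply_along_axis(f, 1, X)

    # Log-sum-exp trick for numerical stability of the exponential average.
    m = fX.min()
    weights = np.exp(-(fX - m) / eps)
    value = m - eps * np.log(weights.mean())
    argmin_est = (weights[:, None] * X).sum(axis=0) / weights.sum()
    return value, argmin_est


if __name__ == "__main__":
    # Illustrative nonconvex (Rastrigin-type) objective, global minimum 0 at the origin.
    def f(x):
        return np.sum(x ** 2 + 1.0 - np.cos(2.0 * np.pi * x))

    val, xhat = soft_min_value(f, x0=[0.5, -0.5])
    print(f"soft-min value ~ {val:.4f}, approximate minimizer ~ {xhat}")
```

Shrinking eps tightens the Gibbs weighting toward the global minimum but makes the estimator higher-variance, which is the usual trade-off behind the convergence-rate discussion in the abstract.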

Country of Origin
🇨🇦 Canada

Page Count
34 pages

Category
Mathematics:
Optimization and Control