Stability of Mean-Field Variational Inference

Published: June 9, 2025 | arXiv ID: 2506.07856v1

By: Shunan Sheng, Bohan Wu, Alberto González-Sanz, and more

Potential Business Impact:

Makes approximate Bayesian computations more stable and accurate: small changes in the data or the target model lead to only small changes in the resulting estimates.

Business Areas:
A/B Testing, Data and Analytics

Mean-field variational inference (MFVI) is a widely used method for approximating high-dimensional probability distributions by product measures. This paper studies the stability properties of the mean-field approximation when the target distribution varies within the class of strongly log-concave measures. We establish dimension-free Lipschitz continuity of the MFVI optimizer with respect to the target distribution, measured in the 2-Wasserstein distance, with Lipschitz constant inversely proportional to the log-concavity parameter. Under additional regularity conditions, we further show that the MFVI optimizer depends differentiably on the target potential and characterize the derivative by a partial differential equation. Methodologically, we follow a novel approach to MFVI via linearized optimal transport: the non-convex MFVI problem is lifted to a convex optimization over transport maps with a fixed base measure, enabling the use of calculus of variations and functional analysis. We discuss several applications of our results to robust Bayesian inference and empirical Bayes, including a quantitative Bernstein--von Mises theorem for MFVI, as well as to distributed stochastic control.
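To make the stability claim concrete, here is a minimal numerical sketch, not the paper's method: for a multivariate Gaussian target (a special strongly log-concave case), MFVI has a well-known closed form (coordinate means equal to the target means, coordinate precisions equal to the diagonal of the target precision matrix), so one can directly compare the 2-Wasserstein distance between the MFVI optimizers of two nearby targets with the distance between the targets themselves. The helper names `mfvi_gaussian` and `w2_gaussians`, the perturbation sizes, and the final ratio check are illustrative assumptions; the paper's actual bound involves a constant of order 1/alpha, where alpha is the strong log-concavity parameter.

```python
# Toy stability check for MFVI on Gaussian targets (illustrative only).
import numpy as np
from scipy.linalg import sqrtm


def mfvi_gaussian(mu, Sigma):
    """MFVI optimizer for a Gaussian target N(mu, Sigma): a product of 1-D
    Gaussians with the same means and variances 1 / Lambda_ii, where
    Lambda = Sigma^{-1} (standard closed form for Gaussian targets)."""
    Lambda = np.linalg.inv(Sigma)
    return mu, 1.0 / np.diag(Lambda)  # coordinate means, coordinate variances


def w2_gaussians(mu1, Sigma1, mu2, Sigma2):
    """2-Wasserstein distance between multivariate Gaussians (Bures formula)."""
    root = sqrtm(sqrtm(Sigma2) @ Sigma1 @ sqrtm(Sigma2))
    bures = np.trace(Sigma1 + Sigma2 - 2.0 * np.real(root))
    return np.sqrt(np.sum((mu1 - mu2) ** 2) + bures)


rng = np.random.default_rng(0)
d = 5

# Strongly log-concave target: the potential's Hessian is >= alpha * I.
A = rng.standard_normal((d, d))
Lambda1 = A @ A.T + 2.0 * np.eye(d)          # precision of target 1
alpha = np.linalg.eigvalsh(Lambda1).min()    # log-concavity parameter
mu1, Sigma1 = rng.standard_normal(d), np.linalg.inv(Lambda1)

# Perturbed target: the "varying" distribution in the stability question.
mu2 = mu1 + 0.1 * rng.standard_normal(d)
Lambda2 = Lambda1 + 0.05 * np.eye(d)
Sigma2 = np.linalg.inv(Lambda2)

# MFVI optimizers (product measures) for both targets.
m1, v1 = mfvi_gaussian(mu1, Sigma1)
m2, v2 = mfvi_gaussian(mu2, Sigma2)

# W2 between product Gaussians splits across coordinates.
w2_mfvi = np.sqrt(np.sum((m1 - m2) ** 2 + (np.sqrt(v1) - np.sqrt(v2)) ** 2))
w2_targets = w2_gaussians(mu1, Sigma1, mu2, Sigma2)

print(f"W2 between MFVI optimizers: {w2_mfvi:.4f}")
print(f"W2 between targets:         {w2_targets:.4f}")
print(f"ratio = {w2_mfvi / w2_targets:.3f}  (1/alpha = {1/alpha:.3f})")
```

In this Gaussian toy example the ratio of the two distances stays modest, consistent in spirit with a dimension-free Lipschitz bound whose constant scales like 1/alpha; the paper establishes the general result for arbitrary strongly log-concave targets via linearized optimal transport, not by this closed-form computation.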

Country of Origin
🇺🇸 United States

Page Count
43 pages

Category
Mathematics:
Probability