Stability of Mean-Field Variational Inference
By: Shunan Sheng, Bohan Wu, Alberto González-Sanz, and more
Potential Business Impact:
Makes computer guesses more stable and accurate.
Mean-field variational inference (MFVI) is a widely used method for approximating high-dimensional probability distributions by product measures. This paper studies the stability properties of the mean-field approximation when the target distribution varies within the class of strongly log-concave measures. We establish dimension-free Lipschitz continuity of the MFVI optimizer with respect to the target distribution, measured in the 2-Wasserstein distance, with Lipschitz constant inversely proportional to the log-concavity parameter. Under additional regularity conditions, we further show that the MFVI optimizer depends differentiably on the target potential and characterize the derivative by a partial differential equation. Methodologically, we develop a novel approach to MFVI via linearized optimal transport: the non-convex MFVI problem is lifted to a convex optimization over transport maps with a fixed base measure, enabling the use of calculus of variations and functional analysis. We discuss several applications of our results to robust Bayesian inference and empirical Bayes, including a quantitative Bernstein–von Mises theorem for MFVI, as well as to distributed stochastic control.
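As a reading aid, here is a minimal LaTeX sketch of the two objects the abstract describes: the MFVI problem and the schematic form of the stability bound. The notation (the optimizer map π*, the log-concavity parameter α, the constant C) and the exact form of the bound are illustrative assumptions, not the paper's precise statement.

```latex
% A minimal sketch (not the paper's exact statement) of the MFVI problem
% and the stability bound described in the abstract. The notation
% \pi^\star, \alpha, C is assumed here for illustration.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% MFVI: the best product-measure approximation of a target \pi on R^d,
% measured in Kullback--Leibler divergence.
\[
  \pi^\star(\pi) \in \operatorname*{arg\,min}_{\nu = \nu_1 \otimes \cdots \otimes \nu_d}
  \operatorname{KL}(\nu \,\|\, \pi).
\]

% Dimension-free Lipschitz stability (schematic): for targets \pi_0, \pi_1
% that are \alpha-strongly log-concave,
\[
  W_2\bigl(\pi^\star(\pi_0),\, \pi^\star(\pi_1)\bigr)
  \le \frac{C}{\alpha}\, W_2(\pi_0,\, \pi_1),
\]
% with C independent of the dimension d; the abstract states only that the
% Lipschitz constant is inversely proportional to the log-concavity
% parameter, so the exact constant and normalization are left to the paper.

\end{document}
```

The convexification step mentioned in the abstract replaces the optimization over product measures with an optimization over transport maps that push a fixed base measure forward onto the candidate measures; with the base measure held fixed, the lifted objective is convex in the map, which is what makes the calculus-of-variations and functional-analytic arguments available.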
Similar Papers
Variational Inference for Latent Variable Models in High Dimensions
Statistics Theory
Makes computer models understand data better.
Mode Collapse of Mean-Field Variational Inference
Machine Learning (Stat)
Fixes computer guesses that get stuck on one answer.
Nearly Dimension-Independent Convergence of Mean-Field Black-Box Variational Inference
Machine Learning (Stat)
Makes computer learning faster, even with many details.