
Geometric Convergence Analysis of Variational Inference via Bregman Divergences

Published: October 17, 2025 | arXiv ID: 2510.15548v1

By: Sushil Bohara, Amedeo Roberto Esposito

Potential Business Impact:

Provides convergence guarantees for variational inference, making a core tool of scalable Bayesian machine learning more reliable and predictable in practice.

Business Areas:
A/B Testing, Data and Analytics

Variational Inference (VI) provides a scalable framework for Bayesian inference by optimizing the Evidence Lower Bound (ELBO), but convergence analysis remains challenging because the objective is non-convex and non-smooth in Euclidean space. We establish a novel theoretical framework for analyzing VI convergence by exploiting the exponential family structure of distributions. We express the negative ELBO as a Bregman divergence with respect to the log-partition function, enabling a geometric analysis of the optimization landscape. We show that this Bregman representation admits a weak monotonicity property that, while weaker than convexity, provides sufficient structure for rigorous convergence analysis. By deriving bounds on the objective function along rays in parameter space, we establish regularity properties governed by the spectral characteristics of the Fisher information matrix. Under this geometric framework, we prove non-asymptotic convergence rates for gradient descent algorithms with both constant and diminishing step sizes.
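
To make the Bregman-divergence view concrete, the following minimal Python sketch (an illustration, not the paper's construction) uses a univariate Gaussian exponential family with unit variance, natural parameter eta, and log-partition A(eta) = eta^2 / 2. In this conjugate toy case the KL divergence to a target member of the family equals the Bregman divergence of A, which plays the role of the negative ELBO up to an additive constant; eta_star and the step size are assumed values chosen for illustration, and the Fisher information A''(eta) = 1 governs the safe step-size range for gradient descent.

    import numpy as np

    def A(eta):
        return 0.5 * eta ** 2          # log-partition function of the toy family

    def grad_A(eta):
        return eta                     # moment map (mean parameter)

    eta_star = 2.0                     # hypothetical target (posterior) natural parameter

    def bregman_objective(eta):
        # D_A(eta_star, eta) = A(eta_star) - A(eta) - <grad A(eta), eta_star - eta>
        return A(eta_star) - A(eta) - grad_A(eta) * (eta_star - eta)

    def grad_objective(eta):
        # d/d eta D_A(eta_star, eta) = -A''(eta) * (eta_star - eta); here A''(eta) = 1
        return -(eta_star - eta)

    # Gradient descent with a constant step size; since the Fisher information
    # A''(eta) = 1 bounds the curvature, any step below 2 contracts the error.
    eta, step = -5.0, 0.5
    for t in range(50):
        eta -= step * grad_objective(eta)

    print(eta, bregman_objective(eta))  # eta approaches eta_star, objective approaches 0

Running the loop shows the iterate contracting toward eta_star at a geometric rate, the kind of non-asymptotic behavior the paper quantifies through the spectrum of the Fisher information matrix in the general exponential-family setting.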

Page Count
14 pages

Category
Statistics: Machine Learning (stat.ML)