Nearly Dimension-Independent Convergence of Mean-Field Black-Box Variational Inference
By: Kyurae Kim, Yi-An Ma, Trevor Campbell, and more
Potential Business Impact:
Makes fitting machine-learning models faster, even when the model has many variables.
We prove that, given a mean-field location-scale variational family, black-box variational inference (BBVI) with the reparametrization gradient converges at an almost dimension-independent rate. Specifically, for strongly log-concave and log-smooth targets, the number of iterations for BBVI with a sub-Gaussian family to achieve an objective $\epsilon$-close to the global optimum is $\mathrm{O}(\log d)$, which improves over the $\mathrm{O}(d)$ dependence of full-rank location-scale families. For heavy-tailed families, we provide a weaker $\mathrm{O}(d^{2/k})$ dimension dependence, where $k$ is the number of finite moments. Additionally, if the Hessian of the target log-density is constant, the complexity is free of any explicit dimension dependence. We also prove that our bound on the gradient variance, which is key to our result, cannot be improved using only spectral bounds on the Hessian of the target log-density.
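To make the setting concrete, below is a minimal sketch of mean-field location-scale BBVI with the reparameterization gradient, assuming an illustrative diagonal-Gaussian (hence strongly log-concave and log-smooth) target; the target, step size, and names such as `grad_log_target` and `reparam_gradient` are our own choices for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative strongly log-concave target: N(mu_star, diag(1/prec)).
d = 50
mu_star = rng.normal(size=d)
prec = rng.uniform(0.5, 2.0, size=d)  # diagonal precision of the target

def grad_log_target(x):
    """Gradient of the target log-density at x."""
    return -prec * (x - mu_star)

# Mean-field Gaussian (location-scale) family q(x) = N(m, diag(exp(2 * log_s))).
m = np.zeros(d)
log_s = np.zeros(d)

def reparam_gradient(m, log_s, n_samples=8):
    """Reparameterization-gradient estimate of the negative ELBO.

    Draws eps ~ N(0, I), sets x = m + s * eps, and differentiates
    -E_q[log p(x)] through the sample; the entropy of a diagonal
    Gaussian is sum(log_s) + const, so its gradient is exact.
    """
    s = np.exp(log_s)
    g_m = np.zeros(d)
    g_log_s = np.zeros(d)
    for _ in range(n_samples):
        eps = rng.normal(size=d)
        x = m + s * eps
        g = grad_log_target(x)
        g_m += -g
        g_log_s += -g * eps * s
    g_m /= n_samples
    g_log_s = g_log_s / n_samples - 1.0  # -1.0 is the exact entropy gradient term
    return g_m, g_log_s

# Plain SGD on the variational parameters (one of many possible schemes).
step = 0.05
for t in range(2000):
    g_m, g_log_s = reparam_gradient(m, log_s)
    m -= step * g_m
    log_s -= step * g_log_s

print("max location error:", np.max(np.abs(m - mu_star)))
print("max scale error:", np.max(np.abs(np.exp(log_s) - prec ** -0.5)))
```

Because the illustrative target is itself a diagonal Gaussian, the optimal mean-field approximation matches it exactly, so the errors printed at the end should shrink to the level of the remaining gradient noise.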
Similar Papers
Stability of Mean-Field Variational Inference
Probability
Shows when the computer's approximate answers stay stable and accurate.
Geometric Convergence Analysis of Variational Inference via Bregman Divergences
Machine Learning (Stat)
Uses a geometric view of the math to show how quickly the learning procedure converges.
Globally Convergent Variational Inference
Machine Learning (Stat)
Makes this kind of AI learning converge reliably from any starting point.