Variational Bernstein-von Mises theorem with increasing parameter dimension
By: Jiawei Yan, Peirong Xu, Tao Wang
Potential Business Impact:
Proves fast stats shortcuts work for big data
Variational Bayes (VB) provides a computationally efficient alternative to Markov chain Monte Carlo, especially for high-dimensional and large-scale inference. However, existing theory on VB primarily focuses on fixed-dimensional settings or specific models. To address this limitation, this paper develops a finite-sample theory for VB in a broad class of parametric models with latent variables. We establish theoretical properties of the VB posterior, including a non-asymptotic variational Bernstein-von Mises theorem. Furthermore, we derive consistency and asymptotic normality of the VB estimator. An application to multivariate Gaussian mixture models is presented for illustration.
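To make the Gaussian-mixture application concrete, here is a minimal mean-field coordinate-ascent VB (CAVI) sketch for a one-dimensional mixture with unit-variance components, equal mixing weights, and N(0, prior_var) priors on the component means. This is a standard textbook construction, not the paper's algorithm; all names and hyperparameters here are illustrative assumptions. The shrinking variational variances of the mean parameters as the sample grows are the kind of Gaussian contraction a Bernstein-von Mises result formalizes.

```python
import numpy as np

def cavi_gmm(x, K=2, prior_var=10.0, n_iters=50, seed=0):
    """Illustrative mean-field CAVI for a K-component Gaussian mixture
    with unit-variance components, uniform weights, and N(0, prior_var)
    priors on the means. Returns the variational means and variances of
    each component mean, plus the responsibility matrix."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    m = rng.standard_normal(K)   # variational means of q(mu_k)
    s2 = np.ones(K)              # variational variances of q(mu_k)
    for _ in range(n_iters):
        # Update responsibilities q(z_i = k) from E_q[log p(x_i | mu_k)]
        log_r = np.outer(x, m) - 0.5 * (m**2 + s2)       # (n, K)
        log_r -= log_r.max(axis=1, keepdims=True)        # stabilize exp
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # Update q(mu_k) = N(m_k, s2_k): conjugate Gaussian update
        Nk = r.sum(axis=0)
        s2 = 1.0 / (1.0 / prior_var + Nk)
        m = s2 * (r * x[:, None]).sum(axis=0)
    return m, s2, r

# Usage: two well-separated clusters; the variational means recover the
# cluster centers and the variational variances shrink roughly like 1/n.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3.0, 1.0, 200), rng.normal(3.0, 1.0, 200)])
m, s2, r = cavi_gmm(x, K=2)
```

Note the closed-form conjugate updates: with Gaussian likelihood and Gaussian prior on each mean, every coordinate update of the evidence lower bound is available analytically, which is what makes VB cheap relative to MCMC in this model.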
Similar Papers
Generalized Bayes in Conditional Moment Restriction Models
Econometrics
Helps economists understand how companies make things.
Robust and Scalable Variational Bayes
Machine Learning (Stat)
Cleans messy data for better computer learning.
Variational bagging: a robust approach for Bayesian uncertainty quantification
Statistics Theory
Improves computer learning by better guessing unknowns.