Stacking Variational Bayesian Monte Carlo
By: Francesco Silvestrin, Chengkun Li, Luigi Acerbi
Potential Business Impact:
Finds better answers by combining many tries.
Variational Bayesian Monte Carlo (VBMC) is a sample-efficient method for approximate Bayesian inference with computationally expensive likelihoods. While VBMC's local surrogate approach provides stable approximations, its conservative exploration strategy and limited evaluation budget can cause it to miss regions of complex posteriors. In this work, we introduce Stacking Variational Bayesian Monte Carlo (S-VBMC), a method that constructs global posterior approximations by merging independent VBMC runs through a principled and inexpensive post-processing step. Our approach leverages VBMC's mixture posterior representation and per-component evidence estimates, requiring no additional likelihood evaluations while being naturally parallelizable. We demonstrate S-VBMC's effectiveness on two synthetic problems designed to challenge VBMC's exploration capabilities and two real-world applications from computational neuroscience, showing substantial improvements in posterior approximation quality across all cases.
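To make the stacking idea concrete, below is a minimal sketch in Python (NumPy only). It assumes each VBMC run returns a Gaussian-mixture variational posterior (component weights, means, covariances) plus an evidence (ELBO) estimate, and it merges the runs into one global mixture by rescaling each run's component weights by that run's exponentiated evidence estimate. The function name stack_vbmc_runs and this particular re-weighting rule are illustrative simplifications for exposition, not the paper's exact stacking procedure, which optimizes the combined weights directly.

import numpy as np

def stack_vbmc_runs(weights_per_run, means_per_run, covs_per_run, elbo_per_run):
    """Merge Gaussian-mixture posteriors from independent VBMC runs.

    Illustrative sketch: components from every run are kept as-is, and each
    run's overall contribution is set proportional to exp(ELBO_r), i.e. its
    evidence estimate. No extra likelihood evaluations are needed.
    """
    # Run-level weights from evidence estimates (subtract max for stability).
    elbos = np.asarray(elbo_per_run, dtype=float)
    run_w = np.exp(elbos - elbos.max())
    run_w /= run_w.sum()

    stacked_w, stacked_mu, stacked_cov = [], [], []
    for r, (w, mu, cov) in enumerate(zip(weights_per_run, means_per_run, covs_per_run)):
        w = np.asarray(w, dtype=float)
        w = w / w.sum()                      # normalize within the run
        stacked_w.extend(run_w[r] * w)       # rescale by the run-level weight
        stacked_mu.extend(np.asarray(mu))
        stacked_cov.extend(np.asarray(cov))
    return np.array(stacked_w), np.array(stacked_mu), np.array(stacked_cov)

# Toy usage: two runs, each a 2-component mixture over a 1D parameter.
w1, mu1, cov1 = [0.6, 0.4], [[-2.0], [-1.5]], [[[0.3]], [[0.2]]]
w2, mu2, cov2 = [0.5, 0.5], [[1.0], [2.0]], [[[0.25]], [[0.4]]]
w, mu, cov = stack_vbmc_runs([w1, w2], [mu1, mu2], [cov1, cov2], elbo_per_run=[-3.2, -3.0])
print(w.sum())  # ~1.0: a single global mixture over all four components

Because the merge only manipulates the mixture parameters already produced by each run, the post-processing cost is negligible and the individual runs can be executed fully in parallel.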
Similar Papers
Robust and Scalable Variational Bayes
Machine Learning (Stat)
Cleans messy data for better computer learning.
Variational bagging: a robust approach for Bayesian uncertainty quantification
Statistics Theory
Improves computer learning by better guessing unknowns.
A Scalable Variational Bayes Approach for Fitting Non-Conjugate Spatial Generalized Linear Mixed Models via Basis Expansions
Methodology
Lets computers quickly learn from big, messy data.