Stacking Variational Bayesian Monte Carlo

Published: April 7, 2025 | arXiv ID: 2504.05004v2

By: Francesco Silvestrin, Chengkun Li, Luigi Acerbi

Potential Business Impact:

Finds better answers by cheaply combining the results of many independent runs.

Business Areas:
A/B Testing, Data and Analytics

Variational Bayesian Monte Carlo (VBMC) is a sample-efficient method for approximate Bayesian inference with computationally expensive likelihoods. While VBMC's local surrogate approach provides stable approximations, its conservative exploration strategy and limited evaluation budget can cause it to miss regions of complex posteriors. In this work, we introduce Stacking Variational Bayesian Monte Carlo (S-VBMC), a method that constructs global posterior approximations by merging independent VBMC runs through a principled and inexpensive post-processing step. Our approach leverages VBMC's mixture posterior representation and per-component evidence estimates, requiring no additional likelihood evaluations while being naturally parallelizable. We demonstrate S-VBMC's effectiveness on two synthetic problems designed to challenge VBMC's exploration capabilities and two real-world applications from computational neuroscience, showing substantial improvements in posterior approximation quality across all cases.
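The core idea admits a compact illustration: each VBMC run returns a mixture posterior along with per-component evidence estimates, and stacking merges all components into one global mixture without re-evaluating the likelihood. Below is a minimal sketch, not the paper's implementation. The names `stack_vbmc_runs`, `means`, `covs`, and `elbo_k` are hypothetical, and the weighting (proportional to exponentiated per-component evidence) is a simplified stand-in for the paper's optimization of stacking weights against the combined ELBO.

```python
import numpy as np

def stack_vbmc_runs(runs):
    """Merge independent VBMC runs into one global mixture posterior.

    Each element of `runs` is a dict with hypothetical keys:
      'means'  : (K, D) array of component means
      'covs'   : (K, D, D) array of component covariances
      'elbo_k' : (K,) per-component log-evidence contributions

    Simplified sketch: stacking weights are set proportional to
    exp(elbo_k); the actual method optimizes them jointly.
    """
    means = np.concatenate([r['means'] for r in runs], axis=0)
    covs = np.concatenate([r['covs'] for r in runs], axis=0)
    log_w = np.concatenate([r['elbo_k'] for r in runs], axis=0)
    log_w -= log_w.max()        # subtract max for numerical stability
    w = np.exp(log_w)
    w /= w.sum()                # normalize to valid mixture weights
    return means, covs, w

# Example usage with synthetic stand-in runs: merge four runs of two
# 3-D Gaussian components each, then sample from the stacked mixture.
rng = np.random.default_rng(0)
runs = [
    {'means': rng.normal(size=(2, 3)),
     'covs': np.stack([np.eye(3)] * 2),
     'elbo_k': rng.normal(size=2)}
    for _ in range(4)
]
means, covs, w = stack_vbmc_runs(runs)
idx = rng.choice(len(w), size=1000, p=w)
samples = np.array([rng.multivariate_normal(means[i], covs[i]) for i in idx])
```

Because the merge operates only on quantities each run already produced, the runs themselves can be executed fully in parallel and stacked afterwards, which is what makes the post-processing step inexpensive.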

Country of Origin
🇫🇮 Finland

Page Count
24 pages

Category
Statistics: Machine Learning (stat.ML)