Propagating Surrogate Uncertainty in Bayesian Inverse Problems
By: Andrew Gerard Roberts, Michael Dietze, Jonathan Huggins
Potential Business Impact:
Makes conclusions from computer models more trustworthy by accounting for their uncertainty.
Standard Bayesian inference schemes are infeasible for inverse problems with computationally expensive forward models. A common solution is to replace the model with a cheaper surrogate. To avoid overconfident conclusions, it is essential to account for the surrogate approximation by propagating its uncertainty. A variety of distinct uncertainty propagation methods have been suggested, but with little understanding of how they relate to one another. To fill this gap, we propose a mixture distribution termed the expected posterior (EP) as a general baseline for uncertainty-aware posterior approximation, justified by decision-theoretic and modular Bayesian inference arguments. We then investigate the expected unnormalized posterior (EUP), a popular heuristic alternative, analyzing when it may deviate from the EP baseline. Our results show that this heuristic can break down when the surrogate uncertainty is highly non-uniform over the design space, as can be the case when the log-likelihood is emulated by a Gaussian process. Finally, we present the random kernel preconditioned Crank-Nicolson (RKpCN) algorithm, an approximate Markov chain Monte Carlo scheme that provides practical EP approximation in the challenging setting of infinite-dimensional Gaussian process surrogates.
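The EP/EUP contrast can be made concrete with a small numerical sketch. The Python snippet below is illustrative only, not the authors' code: the grid, the flat prior, the emulator mean `mean_loglik`, and the non-uniform standard deviation `sd_loglik` are all invented for the demonstration, and EP and EUP are taken in their usual senses, with the EP averaging normalized posteriors over surrogate draws and the EUP normalizing the averaged unnormalized density once.

```python
# A minimal sketch (not the authors' code) contrasting the expected posterior
# (EP) with the expected unnormalized posterior (EUP) on a 1-D grid, under a
# hypothetical Gaussian process emulator of the log-likelihood whose
# uncertainty is deliberately non-uniform over the design space.
import numpy as np

rng = np.random.default_rng(0)

# Parameter grid and a flat prior over it.
u = np.linspace(-3.0, 3.0, 200)
du = u[1] - u[0]
prior = np.full_like(u, 1.0 / (u[-1] - u[0]))

def normalize(density):
    return density / (density.sum() * du)

# Hypothetical emulator: pointwise mean plus a standard deviation that is
# large only in the tails, mimicking an emulator trained on design points
# concentrated near the posterior mode.
mean_loglik = -0.5 * u**2
sd_loglik = 0.1 + 1.5 * (np.abs(u) > 1.5)

# Correlated draws of the log-likelihood surface: squared-exponential
# correlation scaled by the pointwise sd, sampled via a jittered Cholesky
# factor. (Correlation matters: with independent pointwise noise the
# per-draw normalizer is nearly constant and EP and EUP almost coincide.)
corr = np.exp(-0.5 * (u[:, None] - u[None, :]) ** 2 / 0.5**2)
cov = sd_loglik[:, None] * corr * sd_loglik[None, :]
chol = np.linalg.cholesky(cov + 1e-6 * np.eye(u.size))
draws = mean_loglik + rng.standard_normal((2000, u.size)) @ chol.T

# EP: normalize each sampled posterior, then average the mixture components.
ep = np.mean([normalize(np.exp(ll) * prior) for ll in draws], axis=0)

# EUP: average the unnormalized density first, then normalize once. For a
# Gaussian emulator the pointwise average is exp(mean + 0.5 * sd**2), so the
# non-uniform sd inflates the density exactly where the emulator is unsure.
eup = normalize(np.exp(mean_loglik + 0.5 * sd_loglik**2) * prior)

tail = np.abs(u) > 1.5
print(f"EP tail mass:  {ep[tail].sum() * du:.3f}")
print(f"EUP tail mass: {eup[tail].sum() * du:.3f}")
```

Because E[exp(l(u))] = exp(m(u) + sd(u)^2 / 2) for a Gaussian emulator, the EUP inflates the density precisely in the high-variance regions, while the EP's per-draw normalization damps that inflation; this is the breakdown mode the abstract attributes to non-uniform surrogate uncertainty in log-likelihood emulation.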
Similar Papers
Expectation-propagation for Bayesian empirical likelihood inference
Methodology
Makes computer guesses more accurate without needing exact rules.
Efficient Uncertainty Propagation in Bayesian Two-Step Procedures
Methodology
Makes complex computer predictions faster and more accurate.