Robust variational neural posterior estimation for simulation-based inference
By: Matthew O'Callaghan, Kaisey S. Mandel, Gerry Gilmore
Potential Business Impact:
Fixes computer models that don't match real life.
Recent advances in neural density estimation have enabled powerful simulation-based inference (SBI) methods that can flexibly approximate Bayesian inference for intractable stochastic models. Although these methods have demonstrated reliable posterior estimation when the simulator accurately represents the underlying data generative process (DGP), recent work has shown that they perform poorly in the presence of model misspecification. This poses a significant problem for their use on real-world problems, because simulators always misrepresent the true DGP to some degree. In this paper, we introduce robust variational neural posterior estimation (RVNP), a method which addresses the problem of misspecification in amortised SBI by bridging the simulation-to-reality gap using variational inference and error modelling. We test RVNP on multiple benchmark tasks, including using real data from astronomy, and show that it can recover robust posterior inference in a data-driven manner without adopting tunable hyperparameters or priors governing the misspecification.
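To illustrate the core idea of error modelling for misspecification (not the paper's RVNP method, which uses neural density estimators and variational inference), here is a minimal analytic toy. The simulator predicts x ~ N(theta, 1), but the real data carry an unknown systematic offset; the offset value, prior scale `s`, and sample size are all illustrative assumptions. A naive posterior is overconfidently biased, while marginalising over an additive error term restores coverage of the truth:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting: the simulator says x ~ N(theta, 1), but real observations
# carry an unknown systematic offset (the "simulation-to-reality gap").
true_theta, offset, n = 2.0, 0.5, 200  # illustrative values, not from the paper
x_obs = true_theta + offset + rng.normal(0.0, 1.0, size=n)
xbar = x_obs.mean()

# Naive posterior (flat prior, no error model): theta | x ~ N(xbar, 1/n).
naive_sd = np.sqrt(1.0 / n)

# Error-modelled posterior: write x_i = theta + e + noise_i with a shared
# error e ~ N(0, s^2); marginalising e gives theta | x ~ N(xbar, s^2 + 1/n).
s = 1.0  # assumed prior scale on the systematic error
robust_sd = np.sqrt(s**2 + 1.0 / n)

z_naive = abs(xbar - true_theta) / naive_sd    # many sigma off: overconfident
z_robust = abs(xbar - true_theta) / robust_sd  # under 1 sigma: truth covered
print(f"naive z = {z_naive:.1f}, robust z = {z_robust:.2f}")
```

The naive posterior concentrates tightly around the biased value theta + offset, excluding the truth by several standard deviations, whereas the error-modelled posterior widens just enough to cover it. RVNP's contribution is to achieve this kind of robustness in a data-driven way, without hand-choosing a scale like `s` here.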
Similar Papers
Quantification of Uncertainties in Probabilistic Deep Neural Network by Implementing Boosting of Variational Inference
Machine Learning (CS)
Makes AI smarter and more sure of answers.
Robust Bayesian methods using amortized simulation-based inference
Methodology
Helps computers learn better when rules are a bit fuzzy.
Robust Simulation-Based Inference under Missing Data via Neural Processes
Machine Learning (CS)
Fixes broken data for smarter computer guesses.