Expectation-propagation for Bayesian empirical likelihood inference
By: Kenyon Ng, Weichang Yu, Howard D. Bondell
Potential Business Impact:
Makes computer guesses more accurate without needing exact rules.
Bayesian inference typically relies on specifying a parametric model that approximates the data-generating process. However, misspecified models can yield poor convergence rates and unreliable posterior calibration. Bayesian empirical likelihood offers a semi-parametric alternative: it replaces the parametric likelihood with a profile empirical likelihood defined through moment constraints, thereby avoiding explicit distributional assumptions. Despite these advantages, Bayesian empirical likelihood faces substantial computational challenges, including the need to solve a constrained optimization problem for every likelihood evaluation and difficulties posed by non-convex posterior support, particularly in small-sample settings. This paper introduces a variational approach based on expectation-propagation to approximate the Bayesian empirical-likelihood posterior, balancing computational cost and accuracy without resorting to adjustments, such as pseudo-observations, that alter the target posterior. Empirically, we show that our approach can achieve a superior cost-accuracy trade-off relative to existing methods, including Hamiltonian Monte Carlo and variational Bayes. Theoretically, we show that the approximation and the Bayesian empirical-likelihood posterior are asymptotically equivalent.
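To make the computational bottleneck concrete: for each candidate parameter theta, the profile empirical likelihood maximizes the product of n*w_i over weights w_i >= 0 with sum(w_i) = 1 and sum(w_i * g(x_i, theta)) = 0, which by Lagrange duality reduces to log ELR(theta) = min over lambda of -sum_i log(1 + lambda' g(x_i, theta)). The sketch below illustrates this inner optimization for the simplest case, a scalar mean constraint g(x, theta) = x - theta. It is a minimal illustration of the standard dual formulation (as in Owen's empirical likelihood), not the paper's expectation-propagation algorithm, and the function names are our own.

```python
# Minimal sketch of one profile empirical-likelihood evaluation via its
# convex dual, for the mean constraint g(x, theta) = x - theta.
# Illustrative only; helper names are not from the paper.
import numpy as np
from scipy.optimize import minimize

def log_star(z, eps):
    """Pseudo-logarithm: log(z) for z >= eps, its second-order Taylor
    expansion below, keeping the dual objective finite and smooth."""
    out = np.empty_like(z, dtype=float)
    ok = z >= eps
    out[ok] = np.log(z[ok])
    out[~ok] = np.log(eps) - 1.5 + 2.0 * z[~ok] / eps - 0.5 * (z[~ok] / eps) ** 2
    return out

def log_el_ratio(theta, x):
    """Log profile empirical-likelihood ratio at theta:
    min over lam of -sum_i log(1 + lam * g_i), with g_i = x_i - theta."""
    g = x - theta
    eps = 1.0 / len(x)
    dual = lambda lam: -np.sum(log_star(1.0 + lam[0] * g, eps))
    res = minimize(dual, x0=np.zeros(1), method="BFGS")
    return res.fun  # equals sum_i log(n * w_i) <= 0 at the optimum

rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=1.0, size=50)
for theta in (0.5, 1.0, 1.5):
    print(f"theta={theta:.1f}  log-EL-ratio={log_el_ratio(theta, x):.3f}")
```

Every posterior evaluation in Bayesian empirical likelihood pays for one such optimization, and the objective is only defined where the moment constraint is feasible; this per-evaluation cost and the resulting non-convex support are the difficulties the abstract highlights.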
Similar Papers
Bayesian Prediction under Moment Conditioning
Statistics Theory
Helps computers guess better with missing info.
Detecting Model Misspecification in Bayesian Inverse Problems via Variational Gradient Descent
Methodology
Finds when computer models are wrong.
Pseudo Empirical Likelihood Inference for Non-Probability Survey Samples
Methodology
Improves how we learn from surveys with missing info.