A geometric ensemble method for Bayesian inference
By: Andrey A Popov
Potential Business Impact:
Improves how computers estimate hidden quantities, such as the unobserved state of a physical system, from noisy measurements.
Conventional approaches to Bayesian inference rely on approximations either by summary statistics, such as the mean and covariance, or by point particles. Recent advances such as the ensemble Gaussian mixture filter have generalized these notions to sums of parameterized distributions. This work presents a new methodology for approximating Bayesian inference by sums of uniform distributions on convex polytopes. The methodology is developed from the simplest convex polytope filter, which takes advantage of uniform prior and measurement uncertainty, up to an operationally viable ensemble filter with Kalmanized approximations to the convex polytope update. Numerical results on the Ikeda map show the viability of this methodology in the low-dimensional setting, and numerical results on the Lorenz '96 equations show its viability in the high-dimensional setting.
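To make the "simplest convex polytope filter" idea concrete, the sketch below is an illustration under simplifying assumptions, not the paper's algorithm: it performs a Bayesian update when the prior is uniform on an axis-aligned box, the observation operator is the identity, and the measurement error is uniform on a box, in which case the posterior is uniform on the intersection of the prior box with the box of states consistent with the observation. The function name box_filter_update and the example numbers are hypothetical; the paper's methodology works with general convex polytopes, ensembles of them, and Kalmanized updates.

```python
import numpy as np

def box_filter_update(prior_lo, prior_hi, obs, err_lo, err_hi):
    """Bayesian update with a uniform prior on an axis-aligned box and
    uniform measurement error, for a direct (identity) observation.

    Assumes y = x + err, with err uniform on [err_lo, err_hi].
    The posterior is then uniform on the intersection of the prior box
    with the box of states consistent with the observation.
    """
    prior_lo = np.asarray(prior_lo, float)
    prior_hi = np.asarray(prior_hi, float)
    obs = np.asarray(obs, float)
    # States consistent with the observation: obs - err_hi <= x <= obs - err_lo.
    meas_lo = obs - np.asarray(err_hi, float)
    meas_hi = obs - np.asarray(err_lo, float)
    # Intersect the prior box with the observation-consistent box.
    post_lo = np.maximum(prior_lo, meas_lo)
    post_hi = np.minimum(prior_hi, meas_hi)
    if np.any(post_lo > post_hi):
        raise ValueError("Prior box and observation box do not intersect.")
    return post_lo, post_hi

# Hypothetical example: 2-D state, prior uniform on [0, 1]^2,
# observation y = (0.9, 0.3) with error uniform on [-0.2, 0.2]^2.
lo, hi = box_filter_update([0.0, 0.0], [1.0, 1.0],
                           [0.9, 0.3], [-0.2, -0.2], [0.2, 0.2])
print(lo, hi)  # posterior box: lower corner [0.7, 0.1], upper corner [1.0, 0.5]
```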
Similar Papers
Statistical accuracy of the ensemble Kalman filter in the near-linear setting
Statistics Theory
Helps predict weather and diseases better.
Nonlinear Bayesian Update via Ensemble Kernel Regression with Clustering and Subsampling
Machine Learning (Stat)
Improves predictions when systems change in strongly nonlinear ways.
Robust Inference for Convex Pairwise Difference Estimators
Econometrics
Makes computer predictions more accurate with less data.