Nonlinear Bayesian Update via Ensemble Kernel Regression with Clustering and Subsampling
By: Yoonsang Lee
Potential Business Impact:
Improves predictions for systems that change in strongly nonlinear, non-Gaussian ways.
A nonlinear Bayesian update for a prior ensemble is proposed to extend traditional ensemble Kalman filtering to settings characterized by non-Gaussian priors and nonlinear measurement operators. In this framework, the observed component of the state is first denoised via a standard Kalman update, while the unobserved component is estimated by a nonlinear regression built on kernel density estimation, conditioned on the denoised observed component. The method incorporates a subsampling strategy to ensure stability and, when necessary, employs unsupervised clustering to refine the conditional estimate. Numerical experiments on Lorenz systems and a PDE-constrained inverse problem show that the proposed nonlinear update can reduce estimation errors compared with standard linear updates, especially in highly nonlinear regimes.
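The abstract describes a two-stage update: a standard Kalman step denoises the observed component, and a kernel-based nonlinear regression then estimates the rest of the state conditioned on those denoised observations, with subsampling for stability. The Python/NumPy sketch below illustrates that structure under stated assumptions; the function name, Gaussian kernel, bandwidth, and stochastic (perturbed-observation) Kalman step are illustrative choices rather than the paper's exact formulation, and the clustering refinement mentioned in the abstract is omitted.

```python
import numpy as np


def nonlinear_ensemble_update(ens, h, obs, obs_cov, bandwidth=0.5,
                              subsample=None, rng=None):
    """Sketch of a nonlinear Bayesian ensemble update (assumed form).

    ens       : (N, d) prior ensemble of states
    h         : callable mapping a state (d,) to observation space (m,)
    obs       : (m,) observed data
    obs_cov   : (m, m) observation-error covariance
    bandwidth : Gaussian-kernel bandwidth for the conditional estimate
    subsample : optional number of members used in the kernel regression
    """
    rng = np.random.default_rng() if rng is None else rng
    N, _ = ens.shape
    Hx = np.array([h(x) for x in ens])            # (N, m) predicted observations
    m = Hx.shape[1]

    # Step 1: Kalman-type update of the observed component h(x).
    A = Hx - Hx.mean(axis=0)                      # observation-space anomalies
    C_yy = A.T @ A / (N - 1)                      # sample covariance of h(x)
    gain = C_yy @ np.linalg.inv(C_yy + obs_cov)   # gain acting in observation space
    obs_pert = obs + rng.multivariate_normal(np.zeros(m), obs_cov, size=N)
    Hx_post = Hx + (obs_pert - Hx) @ gain.T       # denoised observed component

    # Step 2: Nadaraya-Watson kernel regression of the state on h(x),
    # evaluated at the denoised observed values (Gaussian kernel).
    idx = np.arange(N)
    if subsample is not None and subsample < N:
        idx = rng.choice(N, size=subsample, replace=False)  # subsampling for stability
    ens_sub, Hx_sub = ens[idx], Hx[idx]

    post = np.empty_like(ens)
    for i in range(N):
        d2 = np.sum((Hx_sub - Hx_post[i]) ** 2, axis=1)
        w = np.exp(-0.5 * d2 / bandwidth ** 2) + 1e-300     # avoid zero division
        w /= w.sum()
        post[i] = w @ ens_sub                     # conditional-mean estimate
    return post


# Tiny usage example with a hypothetical setup: only the first state variable
# is observed, through a quadratic operator, where a linear update struggles.
rng = np.random.default_rng(0)
prior = rng.normal(size=(200, 3))
h = lambda x: np.array([x[0] ** 2])
data = h(np.array([1.2, -0.5, 0.3])) + rng.normal(scale=0.1, size=1)
posterior = nonlinear_ensemble_update(prior, h, data, np.array([[0.01]]),
                                      bandwidth=0.3, subsample=100, rng=rng)
```

In this reading, the bandwidth and subsample size govern the bias-variance trade-off of the conditional estimate; the clustering step mentioned in the abstract would presumably partition the ensemble into modes first so the regression is carried out within the relevant cluster.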
Similar Papers
A geometric ensemble method for Bayesian inference
Optimization and Control
Makes computers guess better about hidden things.
Bayesian Nonparametric Dynamical Clustering of Time Series
Machine Learning (Stat)
Finds hidden patterns in heartbeats over time.
Adaptive Bayesian Optimization for Robust Identification of Stochastic Dynamical Systems
Machine Learning (Stat)
Finds hidden patterns in changing systems better.