Proximal Approximate Inference in State-Space Models
By: Hany Abdulsamad, Ángel F. García-Fernández, Simo Särkkä
Potential Business Impact:
Helps computers estimate hidden quantities, such as a vehicle's true position, more accurately from noisy measurements.
We present a class of algorithms for state estimation in nonlinear, non-Gaussian state-space models. Our approach is based on a variational Lagrangian formulation that casts Bayesian inference as a sequence of entropic trust-region updates subject to dynamic constraints. This framework gives rise to a family of forward-backward algorithms, whose structure is determined by the chosen factorization of the variational posterior. By focusing on Gauss–Markov approximations, we derive recursive schemes with favorable computational complexity. For general nonlinear, non-Gaussian models we close the recursions using generalized statistical linear regression and Fourier–Hermite moment matching.
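The abstract mentions closing the forward-backward recursions with generalized statistical linear regression. The sketch below is a rough illustration, not the authors' implementation: it shows a Monte Carlo form of statistical linear regression (SLR) and how it can close one predict/update step of a Gaussian filter. The function names slr and gaussian_filter_step, and the toy dynamics/measurement functions f and h, are hypothetical placeholders; the paper's entropic trust-region updates and Fourier-Hermite closures are not reproduced here.

```python
# Minimal sketch (assumed, not from the paper): Monte Carlo statistical linear
# regression and a single Gaussian filtering step that it closes.
import numpy as np

def slr(func, mean, cov, n_samples=10_000, rng=None):
    """Affine approximation func(x) ~= A x + b + e, e ~ N(0, Omega),
    with respect to q(x) = N(mean, cov), obtained by moment matching."""
    rng = np.random.default_rng() if rng is None else rng
    xs = rng.multivariate_normal(mean, cov, size=n_samples)   # x ~ q
    ys = np.array([func(x) for x in xs])                       # y = func(x)
    y_mean = ys.mean(axis=0)
    C_xy = (xs - mean).T @ (ys - y_mean) / n_samples           # Cov[x, y]
    C_yy = (ys - y_mean).T @ (ys - y_mean) / n_samples         # Cov[y, y]
    A = np.linalg.solve(cov, C_xy).T                           # A = C_yx P^{-1}
    b = y_mean - A @ mean
    Omega = C_yy - A @ cov @ A.T                               # residual covariance
    return A, b, Omega

def gaussian_filter_step(m, P, y, f, h, Q, R):
    """One forward (predict + update) step of a Gaussian filter whose
    nonlinearities are closed by SLR."""
    # Predict through the dynamics f.
    A, b, Om = slr(f, m, P)
    m_pred = A @ m + b
    P_pred = A @ P @ A.T + Om + Q
    # Update with the measurement y through h.
    H, c, Gm = slr(h, m_pred, P_pred)
    S = H @ P_pred @ H.T + Gm + R                              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)                        # gain
    m_new = m_pred + K @ (y - (H @ m_pred + c))
    P_new = P_pred - K @ S @ K.T
    return m_new, P_new

# Toy 1-D usage (purely illustrative):
# f = lambda x: np.array([np.sin(x[0])])
# h = lambda x: np.array([x[0] ** 2])
# m, P = gaussian_filter_step(np.zeros(1), np.eye(1), np.array([0.3]),
#                             f, h, Q=0.1 * np.eye(1), R=0.05 * np.eye(1))
```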
Similar Papers
State Estimation for Linear Systems with Non-Gaussian Measurement Noise via Dynamic Programming
Systems and Control
Makes tracking things more accurate and faster.
Recursive Inference for Heterogeneous Multi-Output GP State-Space Models with Arbitrary Moment Matching
Machine Learning (Stat)
Teaches computers to learn how things work faster.
VIKING: Deep variational inference with stochastic projections
Machine Learning (Stat)
Makes smart computer programs more accurate and reliable.