Generative Bayesian Filtering and Parameter Learning
By: Edoardo Marcelli, Sean O'Hagan, Veronika Rockova
Potential Business Impact:
Helps computers learn from messy, unclear data.
Generative Bayesian Filtering (GBF) provides a powerful and flexible framework for performing posterior inference in complex nonlinear and non-Gaussian state-space models. Our approach extends Generative Bayesian Computation (GBC) to dynamic settings, enabling recursive posterior inference using simulation-based methods powered by deep neural networks. GBF does not require explicit density evaluations, making it particularly effective when observation or transition distributions are analytically intractable. To address parameter learning, we introduce the Generative-Gibbs sampler, which bypasses explicit density evaluation by iteratively sampling each variable from its implicit full conditional distribution. This technique is broadly applicable and enables inference in hierarchical Bayesian models with intractable densities, including state-space models. We assess the performance of the proposed methodologies through both simulated and empirical studies, including the estimation of $\alpha$-stable stochastic volatility models. Our findings indicate that GBF significantly outperforms existing likelihood-free approaches in accuracy and robustness when dealing with intractable state-space models.
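To make the density-free recursion concrete, below is a minimal sketch of one filtering update in the spirit of the abstract, for a one-dimensional state-space model. It uses a conditional quantile network as the generative posterior sampler, echoing the quantile-based generators of GBC; the simulators, function names, network architecture, and training settings here are all illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

# Black-box simulators (illustrative stand-ins; in the paper's setting these
# may have intractable densities, e.g. alpha-stable observation noise).
def simulate_transition(x, phi):
    return phi * x + 0.5 * torch.randn_like(x)

def simulate_observation(x):
    return x + 0.3 * torch.randn_like(x)

class QuantileNet(nn.Module):
    """Conditional quantile map Q(tau; y): an inverse-CDF generator for x_t | y_t."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1))

    def forward(self, y, tau):
        return self.net(torch.cat([y, tau], dim=-1)).squeeze(-1)

def pinball(pred, target, tau):
    # Quantile (pinball) loss: minimized by the true conditional quantile function.
    u = target - pred
    t = tau.squeeze(-1)
    return torch.mean(torch.maximum(t * u, (t - 1.0) * u))

def generative_filter_step(x_prev, y_obs, phi, epochs=200, lr=1e-2):
    """One recursion p(x_{t-1} | y_{1:t-1}) -> p(x_t | y_{1:t}), no densities evaluated."""
    x_t = simulate_transition(x_prev, phi)   # draws from the predictive prior
    y_t = simulate_observation(x_t)          # paired pseudo-observations
    model = QuantileNet()
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        tau = torch.rand(x_t.shape[0], 1)    # random quantile levels in (0, 1)
        loss = pinball(model(y_t.unsqueeze(-1), tau), x_t, tau)
        opt.zero_grad(); loss.backward(); opt.step()
    with torch.no_grad():                    # posterior draws: push U(0,1) noise
        tau = torch.rand(x_prev.shape[0], 1) # through the learned quantile map
        y_rep = torch.full_like(tau, float(y_obs))
        return model(y_rep, tau)

# Usage sketch: filter a short synthetic observation stream.
torch.manual_seed(0)
particles = torch.randn(500)                 # initial particle cloud
for y in [0.2, -0.1, 0.4]:
    particles = generative_filter_step(particles, y, phi=0.9)
```

Because posterior draws come from pushing uniform noise through a learned quantile map fitted to simulated (state, observation) pairs, no likelihood or transition density is ever evaluated. For parameter learning in the paper's Generative-Gibbs spirit, one would alternate such state updates with draws of the parameter (here `phi`, treated as fixed for simplicity) from a learned implicit full conditional, in place of an exact Gibbs update.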
Similar Papers
Flow-based Bayesian filtering for high-dimensional nonlinear stochastic dynamical systems
Numerical Analysis
Helps computers guess hidden things better and faster.
Backward Filtering Forward Guiding
Methodology
Helps track hidden paths in complex systems.
Generator Based Inference (GBI)
High Energy Physics - Phenomenology
Finds hidden patterns in science data better.