Generative Bayesian Filtering and Parameter Learning

Published: November 6, 2025 | arXiv ID: 2511.04552v1

By: Edoardo Marcelli, Sean O'Hagan, Veronika Rockova

Potential Business Impact:

Enables statistical models to learn from noisy, complex sequential data where exact likelihoods are unavailable.

Business Areas:
A/B Testing; Data and Analytics

Generative Bayesian Filtering (GBF) provides a powerful and flexible framework for performing posterior inference in complex nonlinear and non-Gaussian state-space models. Our approach extends Generative Bayesian Computation (GBC) to dynamic settings, enabling recursive posterior inference using simulation-based methods powered by deep neural networks. GBF does not require explicit density evaluations, making it particularly effective when observation or transition distributions are analytically intractable. To address parameter learning, we introduce the Generative-Gibbs sampler, which bypasses explicit density evaluation by iteratively sampling each variable from its implicit full conditional distribution. This technique is broadly applicable and enables inference in hierarchical Bayesian models with intractable densities, including state-space models. We assess the performance of the proposed methodologies through both simulated and empirical studies, including the estimation of $\alpha$-stable stochastic volatility models. Our findings indicate that GBF significantly outperforms existing likelihood-free approaches in accuracy and robustness when dealing with intractable state-space models.
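To give a feel for the likelihood-free setting the abstract describes, here is a minimal sketch of a simulation-based filter for a toy linear-Gaussian state-space model. This is *not* the paper's GBF method (which uses deep neural networks): it is a simple ABC-style particle filter in which particles are weighted by the discrepancy between simulated pseudo-observations and the real observation, so the observation density is never evaluated. The model, the kernel bandwidth `eps`, and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy state-space model (illustrative only, not the paper's alpha-stable SV model):
#   x_t = 0.9 * x_{t-1} + N(0, 0.5^2)   (transition: we can only simulate it)
#   y_t = x_t + N(0, 0.3^2)             (observation: we can only simulate it)
def simulate_transition(x):
    return 0.9 * x + rng.normal(0.0, 0.5, size=x.shape)

def simulate_observation(x):
    return x + rng.normal(0.0, 0.3, size=x.shape)

def abc_filter(ys, n_particles=1000, eps=0.5):
    """Likelihood-free filtering: weight each particle by how close its
    simulated pseudo-observation falls to the actual observation, so no
    observation density is ever evaluated."""
    particles = rng.normal(0.0, 1.0, size=n_particles)  # prior draw for x_0
    means = []
    for y in ys:
        particles = simulate_transition(particles)       # propagate
        pseudo = simulate_observation(particles)         # simulate y given x
        w = np.exp(-0.5 * ((pseudo - y) / eps) ** 2)     # kernel on discrepancy
        w /= w.sum()
        means.append(np.sum(w * particles))              # filtered posterior mean
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]                       # resample
    return np.array(means)

# Generate synthetic data from the same model, then filter it.
x, ys = 0.0, []
for _ in range(50):
    x = simulate_transition(np.array([x]))[0]
    ys.append(simulate_observation(np.array([x]))[0])
est = abc_filter(np.array(ys))
print(est.shape)
```

The same recursion structure (propagate, compare simulated to real data, reweight) is what density-free filters exploit; GBF replaces the crude kernel comparison with learned generative maps.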

Page Count
57 pages

Category
Statistics:
Methodology