Particle Gibbs without the Gibbs bit
By: Adrien Corenflos
Potential Business Impact:
Improves how computers guess hidden patterns.
Exact parameter and trajectory inference in state-space models is typically achieved by one of two methods: particle marginal Metropolis-Hastings (PMMH) or particle Gibbs (PGibbs). PMMH is a pseudo-marginal algorithm which jointly proposes a new trajectory and parameter, and accepts or rejects both at once. PGibbs instead alternates between sampling the trajectory, using an algorithm known as conditional sequential Monte Carlo (CSMC), and the parameter, in a Hastings-within-Gibbs fashion. While particle independent Metropolis-Hastings (PIMH), the parameter-free version of PMMH, is known to be statistically worse than CSMC, PGibbs can induce slow mixing if the parameter and the state trajectory are highly correlated. This has made PMMH the method of choice for many practitioners, despite theory and experiments favouring CSMC over PIMH for the parameter-free problem. In this article, we describe a formulation of PGibbs which bypasses the Gibbs step, essentially marginalizing over the trajectory distribution in a fashion similar to PMMH. This is achieved by considering the implementation of a CSMC algorithm for the state-space model integrated over the joint distribution of the current parameter and the parameter proposal. We illustrate the benefits of the method on a simple example known to be challenging for PMMH.
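To fix ideas, the sketch below shows the standard PMMH baseline that the abstract refers to, not the paper's proposed method: a bootstrap particle filter produces an unbiased likelihood estimate, and a random-walk Metropolis-Hastings step accepts or rejects the parameter (and, implicitly, the trajectory) on the ratio of those estimates. The 1-D linear-Gaussian model, flat prior, and proposal step size are placeholder assumptions for illustration only.

```python
# Illustrative sketch of PMMH (not the paper's method): a bootstrap particle
# filter gives an unbiased log-likelihood estimate, used in a pseudo-marginal
# Metropolis-Hastings accept/reject step on the parameter theta.
import numpy as np

def bootstrap_log_likelihood(ys, theta, n_particles=256, rng=None):
    """Bootstrap particle filter log-likelihood estimate for the assumed model
    x_t = theta * x_{t-1} + N(0, 1),  y_t = x_t + N(0, 1)."""
    rng = np.random.default_rng() if rng is None else rng
    particles = rng.normal(0.0, 1.0, n_particles)     # x_0 ~ N(0, 1)
    log_lik = 0.0
    for y in ys:
        # Propagate through the transition kernel (bootstrap proposal).
        particles = theta * particles + rng.normal(0.0, 1.0, n_particles)
        # Weight by the observation density and accumulate the evidence estimate.
        log_w = -0.5 * (y - particles) ** 2 - 0.5 * np.log(2 * np.pi)
        log_lik += np.logaddexp.reduce(log_w) - np.log(n_particles)
        # Multinomial resampling.
        w = np.exp(log_w - log_w.max())
        idx = rng.choice(n_particles, n_particles, p=w / w.sum())
        particles = particles[idx]
    return log_lik

def pmmh(ys, n_iters=1000, step=0.1, seed=0):
    """Pseudo-marginal MH on theta using the particle-filter estimate
    (flat prior on theta assumed for simplicity)."""
    rng = np.random.default_rng(seed)
    theta = 0.5
    log_lik = bootstrap_log_likelihood(ys, theta, rng=rng)
    chain = []
    for _ in range(n_iters):
        theta_prop = theta + step * rng.normal()      # random-walk proposal
        log_lik_prop = bootstrap_log_likelihood(ys, theta_prop, rng=rng)
        if np.log(rng.uniform()) < log_lik_prop - log_lik:
            theta, log_lik = theta_prop, log_lik_prop
        chain.append(theta)
    return np.array(chain)
```

PGibbs would instead alternate a CSMC sweep, which keeps a reference trajectory, with a parameter update given that trajectory; the paper's contribution is a formulation that avoids this explicit Gibbs alternation while retaining the CSMC machinery.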
Similar Papers
Particle Hamiltonian Monte Carlo
Computation
Helps computers understand complex patterns better.
Bouncy particle sampler with infinite exchanging parallel tempering
Machine Learning (CS)
Makes computer predictions more accurate with faster sampling.
Sequential Monte Carlo with Gaussian Mixture Approximation for Infinite-Dimensional Statistical Inverse Problems
Numerical Analysis
Finds hidden patterns in complex data faster.