Towards practical PDMP sampling: Metropolis adjustments, locally adaptive step-sizes, and NUTS-based time lengths
By: Augustin Chevallier, Sam Power, Matthew Sutton
Potential Business Impact:
Makes computers faster and more reliable at drawing samples from complicated probability patterns.
Piecewise-Deterministic Markov Processes (PDMPs) hold significant promise for sampling from complex probability distributions. However, their practical implementation is hindered by the need to compute model-specific bounds on the event rates. Conversely, while Hamiltonian Monte Carlo (HMC) offers a generally efficient approach to sampling, its inability to adapt step sizes locally impedes its performance when sampling complex distributions such as funnels. To address these limitations, we introduce three innovative concepts: (a) a Metropolis-adjusted approximation for PDMP simulation that eliminates the need for explicit bounds without compromising the invariant measure, (b) an adaptive step-size mechanism compatible with the Metropolis correction, and (c) a No-U-Turn Sampler (NUTS)-inspired scheme for dynamically selecting path lengths in PDMPs. These three ideas can be seamlessly integrated into a single, "doubly-adaptive" PDMP sampler with favourable robustness and efficiency properties.
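To give a concrete flavour of idea (a), the sketch below implements a toy, discrete-time analogue in Python: the position drifts along a fixed velocity, and a Metropolis accept/reject step (with a velocity flip on rejection) restores exact invariance of the target, so no model-specific event-rate bounds are needed. This is a minimal illustration in the spirit of the abstract (essentially a Gustafson-style guided walk), not the authors' algorithm; the names log_target, adjusted_pdmp_like_chain, step and refresh_prob are illustrative assumptions.

# Toy sketch (not the paper's algorithm): a discrete-time, Metropolis-adjusted
# "PDMP-like" move. The accept/reject step keeps the target exactly invariant
# without any rate bounds; rejections act like bounces (velocity flips).
import numpy as np

def log_target(x):
    # Example target: a standard Gaussian (replace with your own log-density).
    return -0.5 * np.dot(x, x)

def adjusted_pdmp_like_chain(x0, n_steps, step=0.25, refresh_prob=0.05, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    v = rng.choice([-1.0, 1.0], size=x.shape)        # velocity with +/-1 components
    samples = np.empty((n_steps, x.size))
    for i in range(n_steps):
        y = x + step * v                              # deterministic drift along v
        log_alpha = log_target(y) - log_target(x)     # Metropolis ratio for the move
        if np.log(rng.uniform()) < log_alpha:
            x = y                                     # accept: keep travelling in the same direction
        else:
            v = -v                                    # reject: "bounce" by flipping the velocity
        if rng.uniform() < refresh_prob:
            v = rng.choice([-1.0, 1.0], size=x.shape) # occasional refresh for ergodicity
        samples[i] = x
    return samples

if __name__ == "__main__":
    draws = adjusted_pdmp_like_chain(np.zeros(2), n_steps=5000)
    print("empirical mean:", draws.mean(axis=0))
    print("empirical variance:", draws.var(axis=0))

The occasional velocity refresh keeps this toy chain ergodic in more than one dimension; the paper's samplers go further by combining the Metropolis correction with locally adaptive step sizes (idea (b)) and NUTS-style path-length selection (idea (c)).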
Similar Papers
Metropolis Adjusted Microcanonical Hamiltonian Monte Carlo
Computation
Makes a fast sampling method exact by adding an accept-or-reject check.
Covariance-Adaptive Bouncy Particle Samplers via Split Lagrangian Dynamics
Computation
Helps samplers explore faster by adapting their movements to the shape of the problem.
Piecewise Deterministic Sampling for Constrained Distributions
Computation
Helps computers sample from patterns that must obey fixed rules (constraints).