Transport Reversible Jump Markov Chain Monte Carlo with proposals generated by Variational Inference with Normalizing Flows
By: Pingping Yin, Xiyun Jiao
We present a framework that uses variational inference with normalizing flows (VI-NFs) to generate proposals for reversible jump Markov chain Monte Carlo (RJMCMC), enabling efficient trans-dimensional Bayesian inference. Unlike transport reversible jump (TRJ) methods that rely on forward KL minimization with pilot MCMC samples, our approach minimizes the reverse KL divergence, which requires only samples from a base distribution and thus eliminates costly target sampling. The method employs RealNVP-based flows to learn model-specific transport maps, enabling the construction of both between-model and within-model proposals. Our framework also provides accurate marginal likelihood estimates from the variational approximation, which facilitates efficient model comparison and proposal adaptation in RJMCMC. Experiments on an illustrative example, factor analysis, and variable selection in linear regression show that the TRJ sampler designed with VI-NFs achieves faster mixing and more efficient model-space exploration than existing baselines. The proposed algorithm can be extended to conditional flows for amortized variational inference across models. Code is available at https://github.com/YinPingping111/TRJ_VINFs.
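The key ingredient described above is fitting a flow by minimizing the reverse KL divergence, which needs only samples from the base distribution (not from the target). As a minimal sketch of that idea, the toy example below fits a one-dimensional affine flow T(z) = mu + s*z (a stand-in for the RealNVP transport maps in the paper) to a Gaussian target by Monte Carlo gradient descent on the reverse-KL objective. The target N(2, 0.5^2), the finite-difference optimizer, and all variable names are illustrative assumptions, not the authors' implementation.

```python
import math
import random

random.seed(0)

# Assumed toy target: N(2, 0.5^2). In the paper this would be a model posterior.
MU_T, SIG_T = 2.0, 0.5

def log_p(x):
    """Log density of the target N(MU_T, SIG_T^2)."""
    return -0.5 * math.log(2 * math.pi) - math.log(SIG_T) - 0.5 * ((x - MU_T) / SIG_T) ** 2

def log_base(z):
    """Log density of the standard-normal base distribution."""
    return -0.5 * math.log(2 * math.pi) - 0.5 * z * z

def reverse_kl(mu, log_s, zs):
    """Monte Carlo estimate of KL(q || p) = E_z[log q(T(z)) - log p(T(z))],
    where T(z) = mu + exp(log_s) * z pushes the base through the flow."""
    s = math.exp(log_s)
    total = 0.0
    for z in zs:
        x = mu + s * z
        # Change of variables: log q(x) = log_base(z) - log |dT/dz|.
        log_q = log_base(z) - log_s
        total += log_q - log_p(x)
    return total / len(zs)

# Only base samples are needed -- no sampling from the target.
# Reusing one fixed batch (common random numbers) keeps the objective smooth.
zs = [random.gauss(0.0, 1.0) for _ in range(2000)]

mu, log_s = 0.0, 0.0
lr, eps = 0.1, 1e-4
for _ in range(300):
    # Finite-difference gradients of the reverse-KL estimate (autodiff in practice).
    g_mu = (reverse_kl(mu + eps, log_s, zs) - reverse_kl(mu - eps, log_s, zs)) / (2 * eps)
    g_ls = (reverse_kl(mu, log_s + eps, zs) - reverse_kl(mu, log_s - eps, zs)) / (2 * eps)
    mu -= lr * g_mu
    log_s -= lr * g_ls

print(mu, math.exp(log_s))  # should recover roughly (2.0, 0.5)
```

The fitted map then plays the role the abstract assigns to the learned transport: its pushforward supplies RJMCMC proposals, and the optimized objective relates to the marginal likelihood (for an unnormalized target, the negative reverse KL is the evidence lower bound).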