Vector Copula Variational Inference and Dependent Block Posterior Approximations
By: Yu Fu, Michael Stanley Smith, Anastasios Panagiotelis
Potential Business Impact:
Makes computer models more accurate with less guessing.
Variational inference (VI) is a popular method for estimating statistical and econometric models. The key to VI is the selection of a tractable density to approximate the Bayesian posterior. For large and complex models, a common choice is to assume independence between multivariate blocks in a partition of the parameter space. While this simplifies the problem, it can reduce accuracy. This paper proposes using vector copulas to capture dependence between the blocks parsimoniously. Tailored multivariate marginals are constructed using learnable cyclically monotone transformations. We call the resulting joint distribution a "dependent block posterior" approximation. Vector copula models are suggested that yield tractable and flexible variational approximations. They allow for differing marginals, numbers of blocks, block sizes, and forms of between-block dependence. They also allow the variational optimization to be solved using fast and efficient stochastic gradient methods. The efficacy and versatility of the approach is demonstrated using four different statistical models and 16 datasets with posteriors that are challenging to approximate. In all cases, our method produces more accurate posterior approximations than benchmark VI methods that assume either block independence or factor-based dependence, at limited additional computational cost.
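The core construction the abstract describes can be illustrated with a minimal sketch: draw a correlated Gaussian "base" vector whose correlation matrix couples the blocks (a Gaussian vector copula), then push each block through its own cyclically monotone map to form the block marginals. Everything below is an assumption for illustration, not the paper's implementation: the two-block partition, the cross-block correlation of 0.3, and the use of affine maps with lower-triangular factors (gradients of convex quadratics, hence cyclically monotone) as the learnable transformations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: a two-block partition of the parameter vector.
d1, d2 = 2, 3
d = d1 + d2

# Gaussian vector copula: a joint standard-normal base vector whose
# correlation matrix R couples the two blocks (0.3 is illustrative).
R = np.eye(d)
R[:d1, d1:] = 0.3
R[d1:, :d1] = 0.3
L = np.linalg.cholesky(R)  # R is positive definite for this choice

# "Learnable" cyclically monotone maps per block, here simple affine maps
# theta = mu + A z with lower-triangular A and positive diagonal; such maps
# are gradients of convex quadratics, hence cyclically monotone.
# The numbers are placeholders standing in for fitted variational parameters.
mu1 = np.array([1.0, -0.5])
A1 = np.array([[0.5, 0.0],
               [0.1, 0.8]])
mu2 = np.zeros(3)
A2 = np.diag([1.0, 0.4, 0.2])

# One draw from the dependent block posterior approximation:
z = L @ rng.standard_normal(d)   # correlated base draw across blocks
theta1 = mu1 + A1 @ z[:d1]       # block-1 marginal draw
theta2 = mu2 + A2 @ z[d1:]       # block-2 marginal draw
```

The point of the construction is that the marginal of each block depends only on its own map (here `mu`, `A`), while all between-block dependence is carried by `R`, so dependence is added at a cost of only the cross-block correlation parameters rather than a full joint covariance.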
Similar Papers
Variational Inference with Mixtures of Isotropic Gaussians
Machine Learning (Stat)
Finds better computer guesses for complex problems.
Variational Inference for Latent Variable Models in High Dimensions
Statistics Theory
Makes computer models understand data better.
Bayesian nonparametric copulas with tail dependence
Methodology
Predicts when bad things happen together.