Score: 1

Bayesian computation for high-dimensional Gaussian Graphical Models with spike-and-slab priors

Published: October 22, 2025 | arXiv ID: 2511.01875v1

By: Deborah Sulem, Jack Jewson, David Rossell

Potential Business Impact:

Find hidden connections in lots of data faster.

Business Areas:
A/B Testing Data and Analytics

Gaussian graphical models are widely used to infer dependence structures. Bayesian methods are appealing because they quantify the uncertainty associated with structure learning, i.e., the plausibility of conditional independence statements given the data, and with parameter estimates. However, computational demands have limited their application when the number of variables is large, which has prompted the use of pseudo-Bayesian approaches. We propose fully Bayesian algorithms that provably scale to high dimensions when the data-generating precision matrix is sparse, at a cost similar to the best pseudo-Bayesian methods. The first is a Metropolis-Hastings-within-Block-Gibbs algorithm that performs row-wise updates of the precision matrix using local moves. The second uses a global proposal that adds or removes multiple edges within a row, which can help explore multi-modal posteriors. We obtain spectral gap bounds for both samplers that are dimension-free under suitable settings. We also provide worst-case polynomial bounds on per-iteration costs, though in practice the cost is lower when sparse linear algebra is used. Our examples show that the methods extend the applicability of exact Bayesian inference from roughly 100 to roughly 1,000 variables (equivalently, from about 5,000 candidate edges to about 500,000).

Repos / Data Links

Page Count
139 pages

Category
Statistics: Methodology