On the Posterior Computation Under the Dirichlet-Laplace Prior
By: Paolo Onorati, David B. Dunson, Antonio Canale
Potential Business Impact:
Corrects widely used posterior sampling code so that Bayesian analyses of high-dimensional data deliver reliable estimates and honest uncertainty quantification.
Modern applications routinely collect high-dimensional data, leading to statistical models with more parameters than available samples. A common solution is to impose sparsity in parameter estimation, often through penalized optimization methods. Bayesian approaches instead provide a probabilistic framework to formally quantify uncertainty through shrinkage priors. Among these, the Dirichlet-Laplace prior has attained recognition for its theoretical guarantees and wide applicability. This article identifies a critical yet overlooked issue in the implementation of Gibbs sampling algorithms for such priors. We demonstrate that ambiguities in the presentation of key algorithmic steps, though the steps themselves are mathematically coherent, have led to widespread implementation inaccuracies that fail to target the intended posterior distribution, a target endowed with rigorous asymptotic guarantees. Using the normal-means problem and high-dimensional linear regression as canonical examples, we clarify these implementation pitfalls and their practical consequences, and propose corrected and more efficient sampling procedures.
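To make the pitfall concrete, the sketch below implements one commonly described Gibbs sweep for the normal-means model under the Dirichlet-Laplace prior, following the standard data augmentation of Bhattacharya et al. (2015). It is an illustration under stated assumptions, not the authors' corrected algorithm: the function names (sample_gig, dl_gibbs_sweep), the choice of GIG parametrization, and its mapping onto scipy.stats.geninvgauss are all assumptions made explicit here. Conflating such parametrizations is exactly the kind of ambiguity the article flags.

# A minimal sketch (not the paper's corrected algorithm): one Gibbs sweep for
# the normal-means model y_j = theta_j + e_j, e_j ~ N(0, 1), under the
# Dirichlet-Laplace prior with the standard data augmentation
#   theta_j | psi_j, phi_j, tau ~ N(0, psi_j * phi_j^2 * tau^2),
#   psi_j ~ Exp(1/2),  phi ~ Dirichlet(a, ..., a),  tau ~ Gamma(p * a, 1/2).
import numpy as np
from scipy.stats import geninvgauss, invgauss

def sample_gig(lam, rho, chi, rng):
    # Assumed convention: GIG(lam, rho, chi) has density proportional to
    #   x^(lam - 1) * exp(-(rho * x + chi / x) / 2),  x > 0,
    # mapped onto scipy's geninvgauss(p, b) via p = lam, b = sqrt(rho * chi),
    # scale = sqrt(chi / rho).  Getting this mapping wrong silently changes
    # the target posterior, which is the kind of pitfall the article documents.
    return geninvgauss.rvs(lam, np.sqrt(rho * chi),
                           scale=np.sqrt(chi / rho), random_state=rng)

def dl_gibbs_sweep(y, theta, phi, tau, a, rng):
    p = len(y)
    abs_theta = np.abs(theta) + 1e-12  # guard against theta_j == 0
    # psi_j: 1 / psi_j ~ InverseGaussian(mean = phi_j * tau / |theta_j|, shape = 1)
    psi = 1.0 / invgauss.rvs(phi * tau / abs_theta, random_state=rng)
    # tau | theta, phi ~ GIG(p * a - p, 1, 2 * sum_j |theta_j| / phi_j)
    tau = sample_gig(p * a - p, 1.0, 2.0 * np.sum(abs_theta / phi), rng)
    # phi | theta: draw T_j ~ GIG(a - 1, 1, 2 * |theta_j|), then normalize
    T = np.array([sample_gig(a - 1.0, 1.0, 2.0 * t, rng) for t in abs_theta])
    phi = T / T.sum()
    # theta_j | rest: conjugate normal update with prior variance
    # v_j = psi_j * phi_j^2 * tau^2 and unit observation noise
    v = psi * phi**2 * tau**2
    s = v / (1.0 + v)
    theta = s * y + np.sqrt(s) * rng.standard_normal(p)
    return theta, psi, phi, tau

A driver loop might initialize theta = y.copy(), phi = np.full(p, 1.0 / p), tau = 1.0 with a = 1.0 / p and call dl_gibbs_sweep repeatedly. Note that the ordering of the updates and which variables each conditional marginalizes over are further details the article examines; this sketch only illustrates the structure of the sampler.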
Similar Papers
A note on simulation methods for the Dirichlet-Laplace prior
Computation
Examines simulation methods for sampling under the Dirichlet-Laplace prior.
Spike-and-Slab Posterior Sampling in High Dimensions
Machine Learning (Stat)
Analyzes posterior sampling for spike-and-slab priors in high-dimensional settings.
Total Robustness in Bayesian Nonlinear Regression for Measurement Error Problems under Model Misspecification
Methodology
Develops Bayesian nonlinear regression that stays reliable under measurement error and model misspecification.