Debiased Bayesian Inference for High-dimensional Regression Models
By: Qihui Chen, Zheng Fang, Ruixuan Liu
Potential Business Impact:
Corrects bias in high-dimensional Bayesian estimates so their uncertainty intervals can be trusted as valid confidence intervals.
There has been significant progress in Bayesian inference based on sparsity-inducing (e.g., spike-and-slab and horseshoe-type) priors for high-dimensional regression models. The resulting posteriors, however, in general do not possess desirable frequentist properties, and the credible sets thus cannot serve as valid confidence sets even asymptotically. We introduce a novel debiasing approach that corrects the bias for the entire Bayesian posterior distribution. We establish a new Bernstein-von Mises theorem that guarantees the frequentist validity of the debiased posterior. We demonstrate the practical performance of our proposal through Monte Carlo simulations and two empirical applications in economics.
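To illustrate the general idea of debiasing an entire posterior distribution, here is a minimal sketch in Python. It assumes the correction takes a debiased-Lasso-type form applied draw by draw; the paper's actual construction, priors, and precision-matrix estimator may differ, and the "posterior draws" below are a crude stand-in rather than true spike-and-slab or horseshoe draws.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Simulate a sparse high-dimensional regression with n < p.
n, p, s = 100, 200, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 1.0
y = X @ beta + rng.standard_normal(n)

# Stand-in for posterior draws around a sparse point estimate.
# (A real application would use draws from a spike-and-slab or
# horseshoe posterior; this keeps the sketch self-contained.)
lasso = Lasso(alpha=0.1).fit(X, y)
draws = lasso.coef_ + 0.05 * rng.standard_normal((1000, p))

# Approximate inverse of the Gram matrix (a regularized inverse here;
# node-wise Lasso is the usual choice in the debiased-Lasso literature).
Sigma_hat = X.T @ X / n
Theta_hat = np.linalg.inv(Sigma_hat + 0.1 * np.eye(p))

# One-step correction applied to every draw, so the whole distribution,
# not just its mean, is shifted toward frequentist validity.
def debias(draw):
    return draw + Theta_hat @ X.T @ (y - X @ draw) / n

debiased_draws = np.array([debias(d) for d in draws])

# A 95% interval for the first coefficient from the debiased posterior.
lo, hi = np.percentile(debiased_draws[:, 0], [2.5, 97.5])
print(f"debiased 95% interval for beta_1: [{lo:.3f}, {hi:.3f}]")
```

Under the paper's Bernstein-von Mises theorem, intervals formed from the debiased posterior in this spirit would have asymptotically correct frequentist coverage; the sketch only conveys the mechanics of correcting each draw.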
Similar Papers
Robust Semiparametric Inference for Bayesian Additive Regression Trees
Methodology
Fixes computer predictions when some information is missing.
Bayesian computation for high-dimensional Gaussian Graphical Models with spike-and-slab priors
Methodology
Finds hidden connections in lots of data faster.
Generalized Bayes in Conditional Moment Restriction Models
Econometrics
Helps economists understand how companies make things.