Robust Semiparametric Inference for Bayesian Additive Regression Trees
By: Christoph Breunig, Ruixuan Liu, Zhengfei Yu
Potential Business Impact:
Makes estimates and uncertainty statements based on machine-learning predictions reliable when some data are missing.
We develop a semiparametric framework for inference on the mean response in missing-data settings using a corrected posterior distribution. Our approach is tailored to Bayesian Additive Regression Trees (BART), a powerful predictive method whose nonsmoothness complicates asymptotic theory with multi-dimensional covariates. When BART is combined with Bayesian bootstrap weights, we establish a new Bernstein-von Mises theorem and show that the limit distribution generally contains a bias term. To address this, we introduce RoBART, a posterior bias correction that robustifies BART for valid inference on the mean response. Monte Carlo studies support our theory, demonstrating reduced bias and improved coverage relative to existing procedures using BART.
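The sketch below is a minimal illustration of the general idea, not the authors' RoBART procedure: each Bayesian-bootstrap draw of the mean response is formed from a plug-in BART prediction plus an inverse-propensity-weighted residual correction of the kind that motivates posterior bias correction. The fitters `fit_bart` and `fit_propensity` are hypothetical user-supplied callables, and the weighting scheme is an assumption for illustration only.

```python
import numpy as np

def bias_corrected_mean_draws(y, x, observed, fit_bart, fit_propensity,
                              n_draws=1000, seed=None):
    """Bayesian-bootstrap draws of the mean response E[Y] when some outcomes
    are missing (assumed missing at random given x).

    fit_bart(x_obs, y_obs, w_obs) and fit_propensity(x, observed, w) are
    hypothetical fitters that return prediction functions.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    draws = np.empty(n_draws)
    for b in range(n_draws):
        # Bayesian bootstrap: exponential(1) weights, normalized to mean one.
        w = rng.exponential(1.0, size=n)
        w /= w.mean()
        # Outcome model fitted on observed cases with the bootstrap weights.
        m_hat = fit_bart(x[observed], y[observed], w[observed])
        # Probability of being observed, fitted with the same weights.
        p_hat = fit_propensity(x, observed, w)
        mu = m_hat(x)
        resid = np.where(observed, y - mu, 0.0)
        # Plug-in mean plus inverse-propensity-weighted residual correction.
        corrected = mu + resid / np.clip(p_hat(x), 1e-3, None)
        draws[b] = np.average(corrected, weights=w)
    return draws
```

Under these assumptions, the empirical quantiles of `draws` would serve as a credible interval for the mean response; the paper's contribution is the theory (a Bernstein-von Mises theorem and the RoBART correction) ensuring such intervals have valid frequentist coverage.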
Similar Papers
An Infinite BART model
Computation
Lets computers learn better from different data.
Bayesian Additive Regression Trees for functional ANOVA model
Machine Learning (Stat)
Shows how different things affect results.