Bayesian Conformal Prediction via the Bayesian Bootstrap
By: Graham Gibson
Potential Business Impact:
Makes the uncertainty estimates attached to predictions reliable.
Reliable uncertainty quantification remains a central challenge in predictive modeling. While Bayesian methods are theoretically appealing, their predictive intervals can exhibit poor frequentist calibration, particularly with small sample sizes or model misspecification. We introduce a practical and broadly applicable Bayesian conformal approach based on the influence-function Bayesian bootstrap (BB) with data-driven tuning of the Dirichlet concentration parameter, α. By efficiently approximating the Bayesian bootstrap predictive distribution via influence functions and calibrating α to optimize empirical coverage or average log-probability, our method constructs prediction intervals and distributions that are both well-calibrated and sharp. Across a range of regression models and data settings, this Bayesian conformal framework consistently yields improved empirical coverage and log-score compared to standard Bayesian posteriors. Our procedure is fast, easy to implement, and offers a flexible approach for distributional calibration in predictive modeling.
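As a rough illustration of the core idea (not the paper's implementation), the Python sketch below draws Dirichlet(α, ..., α) weights for a Bayesian bootstrap, refits a weighted linear model per draw rather than using the paper's influence-function approximation, and tunes α on a held-out split so that empirical coverage of the resulting intervals is close to the target level. The function names and toy data here are hypothetical.

import numpy as np

rng = np.random.default_rng(0)

def bb_intervals(X_tr, y_tr, X_te, alpha, n_draws=200, level=0.9):
    # Bayesian-bootstrap predictive intervals for a weighted linear model.
    n = len(y_tr)
    Xd = np.column_stack([np.ones(n), X_tr])          # training design matrix
    Xt = np.column_stack([np.ones(len(X_te)), X_te])  # test design matrix
    preds = np.empty((n_draws, len(X_te)))
    for b in range(n_draws):
        w = rng.dirichlet(np.full(n, alpha))          # Dirichlet(alpha, ..., alpha) weights
        # Weighted least squares: solve (X^T W X) beta = X^T W y.
        beta = np.linalg.solve(Xd.T @ (w[:, None] * Xd), Xd.T @ (w * y_tr))
        sd = np.sqrt(np.average((y_tr - Xd @ beta) ** 2, weights=w))
        preds[b] = Xt @ beta + rng.normal(0.0, sd, len(X_te))  # predictive draw
    q = [(1 - level) / 2, (1 + level) / 2]
    lo, hi = np.quantile(preds, q, axis=0)
    return lo, hi

def tune_alpha(X_tr, y_tr, X_cal, y_cal, grid=(0.1, 0.5, 1.0, 2.0, 5.0), level=0.9):
    # Pick the alpha whose calibration-set coverage is closest to the target
    # (the paper also considers tuning by average log-probability).
    best, best_gap = grid[0], np.inf
    for a in grid:
        lo, hi = bb_intervals(X_tr, y_tr, X_cal, a, level=level)
        gap = abs(np.mean((y_cal >= lo) & (y_cal <= hi)) - level)
        if gap < best_gap:
            best, best_gap = a, gap
    return best

# Toy usage on synthetic data.
X = rng.normal(size=120)
y = 2.0 * X + rng.normal(scale=0.5, size=120)
X_tr, y_tr, X_cal, y_cal = X[:80], y[:80], X[80:], y[80:]
alpha_hat = tune_alpha(X_tr, y_tr, X_cal, y_cal)
lo, hi = bb_intervals(X_tr, y_tr, X_cal, alpha_hat)
print(f"alpha = {alpha_hat}, coverage = {np.mean((y_cal >= lo) & (y_cal <= hi)):.2f}")

Smaller α concentrates the Dirichlet weights on fewer observations and so widens the predictive spread, which is why tuning α trades off sharpness against coverage.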
Similar Papers
The Interplay between Bayesian Inference and Conformal Prediction
Methodology
Combines two statistical methods to make predictions more reliable.
Conformalized Bayesian Inference, with Applications to Random Partition Models
Methodology
Makes complex Bayesian predictions easier to trust.
Conformal prediction without knowledge of labeled calibration data
Methodology
Lets models make predictions with a built-in safety net, even without labeled calibration data.