Adaptive Conformal Prediction via Bayesian Uncertainty Weighting for Hierarchical Healthcare Data
By: Marzieh Amiri Shahbazi, Ali Baheri, Nasibeh Azadeh-Fard
Potential Business Impact:
Helps doctors make safer, smarter decisions for patients.
Clinical decision-making demands uncertainty quantification that provides both distribution-free coverage guarantees and risk-adaptive precision, requirements that existing methods fail to satisfy jointly. We present a hybrid Bayesian-conformal framework that addresses this fundamental limitation in healthcare prediction. Our approach integrates Bayesian hierarchical random forests with group-aware conformal calibration, using posterior uncertainties to weight conformity scores while maintaining rigorous coverage validity. Evaluated on 61,538 admissions across 3,793 U.S. hospitals in 4 regions, our method achieves near-nominal coverage (94.3% against a 95% target) with adaptive precision: intervals are 21% narrower for low-uncertainty cases and appropriately wider for high-risk predictions. Critically, we demonstrate that well-calibrated Bayesian uncertainties alone severely under-cover (14.1%), highlighting the necessity of our hybrid approach. This framework enables risk-stratified clinical protocols, efficient resource planning for high-confidence predictions, and conservative allocation with enhanced oversight for uncertain cases, providing uncertainty-aware decision support across diverse healthcare settings.
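To make the weighting idea concrete, below is a minimal sketch of uncertainty-weighted, group-aware split conformal prediction. It is an illustration under stated assumptions, not the authors' implementation: the spread across trees of a plain scikit-learn random forest stands in for the paper's Bayesian hierarchical posterior, and the function names and region-style grouping are hypothetical. Conformity scores are scaled by per-sample uncertainty, so calibrated intervals shrink where the model is confident and widen where it is not, matching the adaptive behavior described in the abstract.

```python
# Minimal sketch of uncertainty-weighted, group-aware split conformal
# prediction. NOT the authors' code: the tree-to-tree spread of an
# ordinary RandomForestRegressor stands in for the paper's Bayesian
# hierarchical posterior, and all names here are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def forest_mean_std(model, X):
    """Per-sample mean and std across trees (posterior stand-in)."""
    preds = np.stack([tree.predict(X) for tree in model.estimators_])
    return preds.mean(axis=0), preds.std(axis=0) + 1e-8  # floor avoids /0

def calibrate(model, X_cal, y_cal, groups_cal, alpha=0.05):
    """Per-group (1 - alpha) quantile of uncertainty-weighted scores."""
    mu, sigma = forest_mean_std(model, X_cal)
    scores = np.abs(y_cal - mu) / sigma            # weighted conformity score
    q = {}
    for g in np.unique(groups_cal):
        s = np.sort(scores[groups_cal == g])
        k = int(np.ceil((len(s) + 1) * (1 - alpha)))  # finite-sample rank
        q[g] = s[min(k, len(s)) - 1]               # clamp; strictly, k > n => inf
    return q

def predict_interval(model, X, groups, q):
    """Intervals widen with posterior uncertainty, per group."""
    mu, sigma = forest_mean_std(model, X)
    qs = np.array([q[g] for g in groups])
    return mu - qs * sigma, mu + qs * sigma

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(600, 5))
    y = X[:, 0] + rng.normal(scale=0.5, size=600)
    groups = rng.integers(0, 4, size=600)          # e.g., 4 hospital regions
    rf = RandomForestRegressor(n_estimators=100, random_state=0)
    rf.fit(X[:300], y[:300])                       # train / calibrate / test split
    q = calibrate(rf, X[300:450], y[300:450], groups[300:450])
    lo, hi = predict_interval(rf, X[450:], groups[450:], q)
    print(f"empirical coverage: {np.mean((y[450:] >= lo) & (y[450:] <= hi)):.3f}")
```

The per-group quantiles are what give the group-aware (e.g., region-level) calibration; dropping the grouping recovers standard normalized split conformal prediction.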
Similar Papers
Weighted Conformal Prediction for Survival Analysis under Covariate Shift
Methodology
Makes medical predictions more trustworthy.
Reliable Statistical Guarantees for Conformal Predictors with Small Datasets
Machine Learning (CS)
Makes AI smarter and safer with less data.