Uncertainty Quantification for Deep Regression using Contextualised Normalizing Flows
By: Adriel Sosa Marco, John Daniel Kirwan, Alexia Toumpa, and more
Potential Business Impact:
Shows how sure a computer is about its guesses.
Quantifying uncertainty in deep regression models is important both for understanding the confidence of the model and for safe decision-making in high-risk domains. Existing approaches that yield prediction intervals overlook distributional information, neglecting the effect of multimodal or asymmetric distributions on decision-making. Similarly, full or approximated Bayesian methods, while yielding the predictive posterior density, demand major modifications to the model architecture and retraining. We introduce MCNF, a novel post hoc uncertainty quantification method that produces both prediction intervals and the full conditional predictive distribution. MCNF operates on top of the underlying trained predictive model; thus, no retraining of the predictive model is needed. We provide experimental evidence that the MCNF-based uncertainty estimate is well calibrated, is competitive with state-of-the-art uncertainty quantification methods, and provides richer information for downstream decision-making tasks.
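The abstract describes a post hoc recipe: keep the trained predictive model frozen, then fit a conditional normalizing flow that models the distribution of its errors given the input, from which both prediction intervals and a full predictive density follow. The sketch below is a toy illustration of that workflow under stated assumptions, not the paper's MCNF method: a least-squares line stands in for the trained deep regressor, and a single conditional affine flow layer (which reduces to a heteroscedastic Gaussian) stands in for a full contextualised flow. All variable names (`th_mu`, `sigma_at`, etc.) are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic heteroscedastic data: noise scale grows with |x|.
n = 2000
x = rng.uniform(-2.0, 2.0, n)
y = 1.5 * x + rng.normal(0.0, 0.2 + 0.3 * np.abs(x), n)

# Step 1: the underlying predictive model, trained as usual
# (a least-squares line standing in for a deep regressor).
A = np.stack([x, np.ones_like(x)], axis=1)
w, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ w  # the flow is fit post hoc on these residuals

# Step 2: a one-layer conditional affine flow on residuals r:
#   z = (r - mu(x)) / sigma(x),  z ~ N(0, 1),
# with mu and log sigma linear in context features phi(x) = [|x|, 1].
# A single affine layer reduces to a heteroscedastic Gaussian; real
# flows stack richer layers to capture multimodal/asymmetric shapes.
phi = np.stack([np.abs(x), np.ones_like(x)], axis=1)
th_mu = np.zeros(2)  # parameters of mu(x)
th_ls = np.zeros(2)  # parameters of log sigma(x)
lr = 0.05
for _ in range(2000):  # maximize the exact flow log-likelihood
    mu, ls = phi @ th_mu, phi @ th_ls
    z = (resid - mu) * np.exp(-ls)
    th_mu -= lr * phi.T @ (-z * np.exp(-ls)) / n  # d NLL / d mu params
    th_ls -= lr * phi.T @ (1.0 - z**2) / n        # d NLL / d log-sigma params

# Step 3: invert the flow for a 90% prediction interval around the
# base model's point prediction: r = mu(x) + sigma(x) * z, z = +/-1.645.
mu, sigma = phi @ th_mu, np.exp(phi @ th_ls)
lower = A @ w + mu - 1.645 * sigma
upper = A @ w + mu + 1.645 * sigma
coverage = np.mean((y >= lower) & (y <= upper))
sigma_at = lambda q: float(np.exp(np.array([abs(q), 1.0]) @ th_ls))
```

Because the flow sits entirely on top of the frozen base model, no retraining of the regressor is needed, matching the abstract's post hoc claim; the learned `sigma_at` widens the interval where the data are noisier.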
Similar Papers
Contrastive Normalizing Flows for Uncertainty-Aware Parameter Estimation
Data Analysis, Statistics and Probability
Finds hidden clues in science data.
Analyzing Uncertainty Quantification in Statistical and Deep Learning Models for Probabilistic Electricity Price Forecasting
Machine Learning (CS)
Predicts electricity prices more accurately, even when unsure.