Uncertainty Quantification for Regression: A Unified Framework Based on Kernel Scores
By: Christopher Bülte, Yusuf Sale, Gitta Kutyniok, and more
Potential Business Impact:
Helps computers know when they are unsure.
Regression tasks, notably in safety-critical domains, require proper uncertainty quantification, yet the literature remains largely classification-focused. In this light, we introduce a family of measures for total, aleatoric, and epistemic uncertainty based on proper scoring rules, with a particular emphasis on kernel scores. The framework unifies several well-known measures and provides a principled recipe for designing new ones whose behavior, such as tail sensitivity, robustness, and out-of-distribution responsiveness, is governed by the choice of kernel. We prove explicit correspondences between kernel-score characteristics and downstream behavior, yielding concrete design guidelines for task-specific measures. Extensive experiments demonstrate that these measures are effective in downstream tasks and reveal clear trade-offs among instantiations, including robustness and out-of-distribution detection performance.
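To make the idea concrete, below is a minimal sketch (not the authors' reference implementation) of how kernel-score-based uncertainty measures can be computed for an ensemble regressor. It assumes each ensemble member provides Monte Carlo samples from its predictive distribution and uses a Gaussian (RBF) kernel; the function names, bandwidth, and the example ensemble are illustrative assumptions, and other kernels plug in the same way, which is where the design trade-offs discussed in the paper (tail sensitivity, robustness, out-of-distribution responsiveness) enter.

```python
# Sketch of total/aleatoric/epistemic uncertainty from kernel scores for an ensemble.
# Assumptions: 1-D regression targets, predictive samples per member, RBF kernel.
import numpy as np

def rbf_kernel(x, y, bandwidth=1.0):
    """Gaussian kernel k(x, y) evaluated between two 1-D sample arrays."""
    diff = x[:, None] - y[None, :]
    return np.exp(-0.5 * (diff / bandwidth) ** 2)

def kernel_entropy(samples, bandwidth=1.0):
    """Generalized (kernel-score) entropy, H(P) ~ -0.5 * E[k(X, X')] with X, X' ~ P."""
    K = rbf_kernel(samples, samples, bandwidth)
    return -0.5 * K.mean()

def uncertainty_decomposition(member_samples, bandwidth=1.0):
    """Total, aleatoric, and epistemic uncertainty for one input.

    member_samples: array of shape (n_members, n_samples) holding predictive
    samples of each ensemble member at a single test point.
    """
    # Aleatoric: average entropy of the individual members.
    au = np.mean([kernel_entropy(s, bandwidth) for s in member_samples])
    # Total: entropy of the pooled (mixture) predictive distribution.
    tu = kernel_entropy(member_samples.reshape(-1), bandwidth)
    # Epistemic: disagreement between members and the mixture (non-negative
    # because the generalized entropy is concave).
    eu = tu - au
    return tu, au, eu

# Hypothetical example: three ensemble members that disagree about the mean.
rng = np.random.default_rng(0)
members = np.stack([rng.normal(loc=m, scale=1.0, size=500) for m in (0.0, 0.5, 2.0)])
tu, au, eu = uncertainty_decomposition(members, bandwidth=1.0)
print(f"total={tu:.3f}  aleatoric={au:.3f}  epistemic={eu:.3f}")
```

Swapping the RBF kernel for, say, an energy-distance or heavier-tailed kernel changes how strongly the resulting measures react to outliers and distribution shift, which is the design lever the framework formalizes.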
Similar Papers
Uncertainty Quantification for Machine Learning: One Size Does Not Fit All
Machine Learning (CS)
Chooses best way to measure computer guesses.
A Novel Framework for Uncertainty Quantification via Proper Scores for Classification and Beyond
Machine Learning (CS)
Measures how sure AI is about its answers.
An Axiomatic Assessment of Entropy- and Variance-based Uncertainty Quantification in Regression
Machine Learning (CS)
Makes computer predictions more trustworthy and honest.