Fast and Scalable Score-Based Kernel Calibration Tests
By: Pierre Glaser, David Widmann, Fredrik Lindsten, and more
Potential Business Impact:
Checks if computer predictions are trustworthy.
We introduce the Kernel Calibration Conditional Stein Discrepancy test (KCCSD test), a non-parametric, kernel-based test for assessing the calibration of probabilistic models with well-defined scores. In contrast to previous methods, our test avoids the need for possibly expensive expectation approximations while providing control over its type-I error. We achieve these improvements by using a new family of kernels for score-based probabilities that can be estimated without probability density samples, and by using a conditional goodness-of-fit criterion for the KCCSD test's U-statistic. We demonstrate the properties of our test on various synthetic settings.
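To make the abstract's two key ingredients concrete, here is a minimal sketch of a score-based kernel Stein discrepancy (KSD) U-statistic: a goodness-of-fit criterion that needs only the model's score, grad_x log p(x), never density values or samples from p. This is an illustrative assumption, not the paper's KCCSD statistic, which is a conditional variant and uses the paper's new kernels on score-based probabilities; the RBF kernel, the Gaussian example score, and the names `ksd_u_statistic` and `gaussian_score` are all choices made here for illustration.

```python
import numpy as np

def gaussian_score(x, mean, cov_inv):
    # Score of a multivariate Gaussian: grad_x log p(x) = -cov_inv @ (x - mean).
    # Example model only; any function returning grad_x log p(x) would do.
    return -(x - mean) @ cov_inv

def ksd_u_statistic(X, score_fn, bandwidth):
    """KSD U-statistic with an RBF kernel k(x, y) = exp(-||x - y||^2 / (2 h^2)).

    Uses the Langevin Stein kernel
      u_p(x, y) = s(x)^T s(y) k + s(x)^T grad_y k + s(y)^T grad_x k
                  + trace(grad_x grad_y k),
    where s = grad log p. Only scores are needed, matching the
    "score-based" setting the abstract describes.
    """
    n, d = X.shape
    S = score_fn(X)                          # (n, d) scores at each sample
    diffs = X[:, None, :] - X[None, :, :]    # (n, n, d) pairwise x_i - x_j
    sq = np.sum(diffs ** 2, axis=-1)         # squared pairwise distances
    h2 = bandwidth ** 2
    K = np.exp(-sq / (2.0 * h2))             # RBF Gram matrix

    term1 = (S @ S.T) * K                                 # s(x_i)^T s(x_j) k
    term2 = np.einsum("id,ijd->ij", S, diffs) * K / h2    # s(x_i)^T grad_y k
    term3 = -np.einsum("jd,ijd->ij", S, diffs) * K / h2   # s(x_j)^T grad_x k
    term4 = (d / h2 - sq / h2 ** 2) * K                   # tr(grad_x grad_y k)

    U = term1 + term2 + term3 + term4
    np.fill_diagonal(U, 0.0)                 # U-statistic: drop i == j terms
    return U.sum() / (n * (n - 1))

# Usage: samples drawn from the model itself should yield a small statistic.
rng = np.random.default_rng(0)
d = 2
X = rng.standard_normal((200, d))
score = lambda X: gaussian_score(X, np.zeros(d), np.eye(d))
print(ksd_u_statistic(X, score, bandwidth=1.0))
```

In a complete test, the null distribution of such a U-statistic is typically calibrated with a wild-bootstrap or permutation scheme to obtain the type-I error control the abstract refers to; how the KCCSD test conditions on inputs and calibrates its statistic is detailed in the paper itself.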
Similar Papers
A Practical Introduction to Kernel Discrepancies: MMD, HSIC & KSD
Machine Learning (Stat)
Measures how different two groups of data are.
Kernel-Based Evaluation of Conditional Biological Sequence Models
Machine Learning (Stat)
Helps scientists check if computer models understand biology.
Conditional Score Learning for Quickest Change Detection in Markov Transition Kernels
Machine Learning (CS)
Finds hidden changes in data streams faster.