Incremental Uncertainty-aware Performance Monitoring with Active Labeling Intervention
By: Alexander Koebler, Thomas Decker, Ingo Thon, and more
Potential Business Impact:
Detects when a deployed model's accuracy is quietly getting worse.
We study the problem of monitoring machine learning models under gradual distribution shifts, where circumstances change slowly over time, often leading to unnoticed yet significant declines in accuracy. To address this, we propose Incremental Uncertainty-aware Performance Monitoring (IUPM), a novel label-free method that estimates performance changes by modeling gradual shifts using optimal transport. In addition, IUPM quantifies the uncertainty in the performance prediction and introduces an active labeling procedure to restore a reliable estimate under a limited labeling budget. Our experiments show that IUPM outperforms existing performance estimation baselines in various gradual shift scenarios and that its uncertainty awareness guides label acquisition more effectively compared to other strategies.
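The active labeling idea from the abstract can be illustrated in isolation: given per-sample uncertainty in the performance prediction, spend a limited labeling budget on the most uncertain samples, then blend the measured accuracy back into the label-free estimate. This is a minimal, hypothetical sketch of that workflow, not the authors' IUPM implementation; all function names and the blending rule are assumptions for illustration.

```python
# Hypothetical sketch of budgeted, uncertainty-guided label acquisition.
# Not the paper's IUPM method: the selection rule and the simple convex
# blend below are illustrative assumptions.

def acquire_labels(uncertainties, budget):
    """Return indices of the `budget` samples whose performance
    prediction is most uncertain."""
    ranked = sorted(range(len(uncertainties)),
                    key=lambda i: uncertainties[i], reverse=True)
    return ranked[:budget]

def refine_estimate(label_free_estimate, labeled_accuracy, weight=0.5):
    """Blend the label-free performance estimate with accuracy measured
    on the newly labeled samples (simple convex combination)."""
    return (1 - weight) * label_free_estimate + weight * labeled_accuracy

# Example: ten monitored samples with per-sample uncertainty scores.
uncertainties = [0.05, 0.9, 0.1, 0.7, 0.2, 0.95, 0.3, 0.6, 0.15, 0.8]
picked = acquire_labels(uncertainties, budget=3)
print(picked)  # → [5, 1, 9], the three most uncertain samples
```

After labeling the selected samples, `refine_estimate(0.8, 0.6)` would pull a label-free estimate of 0.8 toward the observed accuracy of 0.6, yielding 0.7. The paper's contribution is that the uncertainty scores themselves come from modeling gradual shifts with optimal transport, which this toy selection step takes as given.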
Similar Papers
Reliably detecting model failures in deployment without labels
Machine Learning (CS)
Alerts when computer brains need to learn again.
Learning with Positive and Imperfect Unlabeled Data
Machine Learning (Stat)
Helps computers learn from messy, incomplete data.
A Unified and Stable Risk Minimization Framework for Weakly Supervised Learning with Theoretical Guarantees
Machine Learning (CS)
Teaches computers with less information.