Deep Neural Network Calibration by Reducing Classifier Shift with Stochastic Masking
By: Jiani Ni, He Zhao, Yibo Yang, and more
Potential Business Impact:
Makes an AI's confidence match how often it is actually right.
In recent years, deep neural networks (DNNs) have shown competitive results in many fields. Despite this success, they often suffer from poor calibration, especially in safety-critical scenarios such as autonomous driving and healthcare, where unreliable confidence estimates can lead to serious consequences. Recent studies have focused on improving calibration by modifying the classifier, yet such efforts remain limited. Moreover, most existing approaches overlook calibration errors caused by underconfidence, which can be equally detrimental. To address these challenges, we propose MaC-Cal, a novel mask-based classifier calibration method that leverages stochastic sparsity to enhance the alignment between confidence and accuracy. MaC-Cal adopts a two-stage training scheme with adaptive sparsity, dynamically adjusting mask retention rates based on the deviation between confidence and accuracy. Extensive experiments show that MaC-Cal achieves superior calibration performance and robustness under data corruption, offering a practical and effective solution for reliable confidence estimation in DNNs.
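The abstract only outlines the method, so below is a minimal sketch of the core idea: stochastic masking of the classifier weights, with a retention rate nudged by the gap between average confidence and accuracy. The names (StochasticMaskedClassifier, update_retain_rate) and the specific update rule are illustrative assumptions, not the paper's actual two-stage training procedure.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StochasticMaskedClassifier(nn.Module):
    """Linear classifier whose weights are randomly masked during training.

    `retain_rate` is the probability of keeping each weight; it is meant to be
    adjusted from the observed confidence-accuracy gap (hypothetical rule below).
    """

    def __init__(self, in_features: int, num_classes: int, retain_rate: float = 0.9):
        super().__init__()
        self.fc = nn.Linear(in_features, num_classes)
        self.retain_rate = retain_rate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Sample a Bernoulli mask over the classifier weights (stochastic sparsity).
            mask = torch.bernoulli(torch.full_like(self.fc.weight, self.retain_rate))
            # Inverted-dropout-style rescaling keeps expected logits stable (an assumption here).
            weight = self.fc.weight * mask / max(self.retain_rate, 1e-8)
            return F.linear(x, weight, self.fc.bias)
        return self.fc(x)


def update_retain_rate(model, confidence: float, accuracy: float,
                       step: float = 0.05, lo: float = 0.5, hi: float = 1.0) -> None:
    """Hypothetical adaptive rule: if the model is overconfident (confidence > accuracy),
    keep fewer weights; if underconfident, keep more."""
    gap = confidence - accuracy
    model.retain_rate = float(min(hi, max(lo, model.retain_rate - step * gap)))


if __name__ == "__main__":
    torch.manual_seed(0)
    clf = StochasticMaskedClassifier(in_features=128, num_classes=10)
    clf.train()
    feats = torch.randn(32, 128)                     # stand-in for backbone features
    labels = torch.randint(0, 10, (32,))
    probs = F.softmax(clf(feats), dim=1)
    conf = probs.max(dim=1).values.mean().item()     # average predicted confidence
    acc = (probs.argmax(dim=1) == labels).float().mean().item()  # batch accuracy
    update_retain_rate(clf, conf, acc)
    print(f"confidence={conf:.3f} accuracy={acc:.3f} retain_rate={clf.retain_rate:.3f}")
```

The update rule moves the retention rate down when the model is overconfident and up when it is underconfident, which mirrors the stated goal of correcting both kinds of miscalibration; the exact schedule, stage structure, and masking granularity used by MaC-Cal are described in the paper itself.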
Similar Papers
Monitoring the calibration of probability forecasts with an application to concept drift detection involving image classification
Machine Learning (Stat)
Watches an image classifier's confidence to spot when its data has drifted.
Combating Noisy Labels via Dynamic Connection Masking
Machine Learning (CS)
Helps computers learn well even when training labels are wrong.
Mask to Adapt: Simple Random Masking Enables Robust Continual Test-Time Learning
Computer Vision and Pattern Recognition
Keeps computer vision models working when test images get messy.