Towards Integrating Uncertainty for Domain-Agnostic Segmentation
By: Jesse Brouwers, Xiaoyan Xing, Alexander Timans
Potential Business Impact:
Improves automated image segmentation in difficult conditions such as shadows, transparent objects, and camouflage, where current models often fail.
Foundation models for segmentation such as the Segment Anything Model (SAM) family exhibit strong zero-shot performance, but remain vulnerable in shifted or limited-knowledge domains. This work investigates whether uncertainty quantification can mitigate such challenges and enhance model generalisability in a domain-agnostic manner. To this end, we (1) curate UncertSAM, a benchmark comprising eight datasets designed to stress-test SAM under challenging segmentation conditions including shadows, transparency, and camouflage; (2) evaluate a suite of lightweight, post-hoc uncertainty estimation methods; and (3) assess a preliminary uncertainty-guided prediction refinement step. Among evaluated approaches, a last-layer Laplace approximation yields uncertainty estimates that correlate well with segmentation errors, indicating a meaningful signal. While refinement benefits are preliminary, our findings underscore the potential of incorporating uncertainty into segmentation models to support robust, domain-agnostic performance. Our benchmark and code are made publicly available.
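Among the post-hoc estimators mentioned above, the last-layer Laplace approximation is simple enough to illustrate compactly. The sketch below is not the paper's implementation: it shows a generic diagonal last-layer Laplace approximation for a binary segmentation head in PyTorch, where `backbone`, `feat_dim`, and the data loader are assumed placeholders and SAM's prompt-conditioned decoder is abstracted into a plain 1x1-convolution logit head.

```python
import math
import torch
import torch.nn as nn


class LastLayerLaplaceSeg(nn.Module):
    """Diagonal last-layer Laplace approximation for a binary segmentation head (illustrative sketch)."""

    def __init__(self, backbone: nn.Module, feat_dim: int, prior_precision: float = 1.0):
        super().__init__()
        self.backbone = backbone                            # frozen feature extractor (assumed)
        self.head = nn.Conv2d(feat_dim, 1, kernel_size=1)   # last layer, MAP weights
        self.prior_precision = prior_precision
        self.precision = None                                # diagonal posterior precision over head params

    @torch.no_grad()
    def fit(self, loader):
        """Accumulate a diagonal generalised Gauss-Newton (Fisher) approximation over the fit data."""
        n_params = self.head.weight.numel() + self.head.bias.numel()
        device = self.head.weight.device
        precision = torch.full((n_params,), float(self.prior_precision), device=device)
        for images, _ in loader:                             # labels unused: the GGN needs predictions only
            feats = self.backbone(images)                    # (B, C, H, W)
            p = torch.sigmoid(self.head(feats))              # (B, 1, H, W)
            lam = (p * (1 - p)).flatten(1)                   # per-pixel curvature, (B, H*W)
            phi = feats.flatten(2)                           # (B, C, H*W); d logit / d w_c = phi_c
            precision[:-1] += (lam.unsqueeze(1) * phi ** 2).sum(dim=(0, 2))
            precision[-1] += lam.sum()                       # bias term: d logit / d b = 1
        self.precision = precision

    @torch.no_grad()
    def predict(self, images):
        """Return probit-adjusted probabilities and per-pixel logit variance."""
        feats = self.backbone(images)
        mu = self.head(feats).squeeze(1)                     # MAP logits, (B, H, W)
        var_w = 1.0 / self.precision                         # diagonal posterior variance
        var = (feats.pow(2) * var_w[:-1].view(1, -1, 1, 1)).sum(1) + var_w[-1]
        kappa = 1.0 / torch.sqrt(1.0 + math.pi / 8.0 * var)  # MacKay probit approximation
        return torch.sigmoid(kappa * mu), var                # prediction, pixel-wise uncertainty
```

Under these assumptions, calling `fit` on in-domain data and `predict` on shifted inputs yields a per-pixel variance map alongside the segmentation, the kind of signal whose correlation with segmentation errors the abstract describes.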
Similar Papers
UncertainSAM: Fast and Efficient Uncertainty Quantification of the Segment Anything Model
CV and Pattern Recognition
Provides fast, efficient uncertainty estimates for Segment Anything Model predictions.
Enhancing Self-Driving Segmentation in Adverse Weather Conditions: A Dual Uncertainty-Aware Training Approach to SAM Optimization
CV and Pattern Recognition
Uses uncertainty-aware training to improve SAM-based segmentation for self-driving in adverse weather.
Uncertainty evaluation of segmentation models for Earth observation
CV and Pattern Recognition
Evaluates the uncertainty estimates of segmentation models on Earth observation imagery.