Ranking pre-trained segmentation models for zero-shot transferability
By: Joshua Talks, Anna Kreshuk
Potential Business Impact:
Helps pick the best ready-made model for new science pictures, without needing new labels.
Model transfer presents a solution to the challenges of segmentation in the microscopy community, where the immense cost of labelling sufficient training data is a major bottleneck for the use of deep learning. Alongside the large quantities of imaging data produced across a wide range of imaging conditions, institutes also produce many bespoke models trained on specific source data, which are then collected in model banks or zoos. As the number of available models grows, so does the need for an efficient and reliable way to select a model for a specific target dataset of interest. We focus on the unsupervised regime, where no labels are available for the target dataset. Building on previous work linking model generalisation and consistency under perturbation, we propose the first unsupervised transferability estimator for semantic and instance segmentation tasks that requires neither access to the source training data nor labels in the target domain. We evaluate the method on multiple segmentation problems across microscopy modalities, finding a strong correlation between rankings based on our estimator and rankings based on target dataset performance.
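The abstract does not spell out the estimator itself, but the idea it builds on, ranking candidate models by how consistent their predictions stay when the target images are perturbed, can be sketched roughly as below. This is a minimal illustration for binary semantic masks only; the function names, the Dice-based agreement measure, the Gaussian-noise perturbation, and the number of repeats are all assumptions for the sake of the example, not the authors' actual method.

```python
import numpy as np

def dice(a, b, eps=1e-8):
    """Dice overlap between two binary masks."""
    inter = np.logical_and(a, b).sum()
    return (2.0 * inter + eps) / (a.sum() + b.sum() + eps)

def consistency_score(predict, images, perturb, n_repeats=3, rng=None):
    """Average agreement between predictions on clean and perturbed images.

    predict: callable mapping an image to a binary mask (a pre-trained model)
    perturb: callable (image, rng) -> perturbed image, e.g. additive noise
    """
    rng = rng or np.random.default_rng(0)
    scores = []
    for img in images:
        clean = predict(img)
        for _ in range(n_repeats):
            noisy = predict(perturb(img, rng))
            scores.append(dice(clean, noisy))
    return float(np.mean(scores))

def rank_models(models, images, perturb):
    """Rank candidate models (name -> predict fn) by estimated transferability."""
    scores = {name: consistency_score(fn, images, perturb)
              for name, fn in models.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

def gaussian_perturb(img, rng, sigma=0.05):
    """Example perturbation: mild additive Gaussian noise."""
    return img + rng.normal(0.0, sigma, size=img.shape).astype(img.dtype)
```

Under this reading, no target labels are needed at any point: the models are scored purely by self-agreement on the unlabelled target images, and the resulting ranking is what the paper compares against the ranking obtained from true target-set performance.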
Similar Papers
No Labels Needed: Zero-Shot Image Classification with Collaborative Self-Learning
CV and Pattern Recognition
Teaches computers to sort pictures with no examples.
CellStyle: Improved Zero-Shot Cell Segmentation via Style Transfer
Machine Learning (CS)
Helps computers see cells without needing new labels.
Split Matching for Inductive Zero-shot Semantic Segmentation
CV and Pattern Recognition
Teaches computers to identify new things without training.