CellStyle: Improved Zero-Shot Cell Segmentation via Style Transfer
By: Rüveyda Yilmaz, Zhu Chen, Yuli Wu, and more
Potential Business Impact:
Helps computers see cells without needing new labels.
Cell microscopy data are abundant; however, corresponding segmentation annotations remain scarce. Moreover, variations in cell types, imaging devices, and staining techniques introduce significant domain gaps between datasets. As a result, even large, pretrained segmentation models trained on diverse datasets (source datasets) struggle to generalize to unseen datasets (target datasets). To overcome this generalization problem, we propose CellStyle, which improves the segmentation quality of such models without requiring labels for the target dataset, thereby enabling zero-shot adaptation. CellStyle transfers the attributes of an unannotated target dataset, such as texture, color, and noise, to the annotated source dataset. This transfer is performed while preserving the cell shapes of the source images, ensuring that the existing source annotations can still be used while maintaining the visual characteristics of the target dataset. The styled synthetic images with the existing annotations enable the finetuning of a generalist segmentation model for application to the unannotated target data. We demonstrate that CellStyle significantly improves zero-shot cell segmentation performance across diverse datasets by finetuning multiple segmentation models on the style-transferred data. The code will be made publicly available.
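To make the described workflow concrete, below is a minimal, hypothetical Python sketch of the zero-shot adaptation pipeline from the abstract: style the annotated source images to look like the unannotated target data, then finetune a pretrained segmentation model on the styled pairs. The function names and the simple intensity-statistics stand-in for style transfer are illustrative assumptions, not the authors' actual method or code.

```python
# Hypothetical sketch of the CellStyle-style zero-shot adaptation pipeline.
# style_transfer() here only matches global intensity statistics as a
# placeholder for transferring texture, color, and noise while preserving
# the source cell shapes; it is NOT the paper's method.
import numpy as np

def style_transfer(source_img: np.ndarray, target_img: np.ndarray) -> np.ndarray:
    """Transfer target-like appearance onto a source image (placeholder)."""
    src = source_img.astype(np.float32)
    tgt = target_img.astype(np.float32)
    styled = (src - src.mean()) / (src.std() + 1e-8) * tgt.std() + tgt.mean()
    return np.clip(styled, 0, 255).astype(np.uint8)

# Annotated source dataset: (image, mask) pairs; unannotated target: images only.
source_set = [(np.random.randint(0, 256, (256, 256), dtype=np.uint8),
               np.zeros((256, 256), dtype=np.uint8)) for _ in range(4)]
target_set = [np.random.randint(0, 256, (256, 256), dtype=np.uint8) for _ in range(4)]

# 1) Build a styled synthetic training set: target appearance, source labels.
styled_training_set = [
    (style_transfer(img, target_set[i % len(target_set)]), mask)
    for i, (img, mask) in enumerate(source_set)
]

# 2) Finetune a pretrained generalist segmentation model on the styled pairs
#    (finetuning code omitted; depends on the chosen model).
# 3) Apply the finetuned model to the unannotated target images, i.e.
#    zero-shot with respect to target labels.
```

Because the styled images keep the source cell shapes, the original source masks remain valid training labels even though the images now carry the target dataset's appearance.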
Similar Papers
Ranking pre-trained segmentation models for zero-shot transferability
CV and Pattern Recognition
Lets computers learn from old science pictures.
StyleClone: Face Stylization with Diffusion Based Data Augmentation
CV and Pattern Recognition
Changes photos to look like a chosen style.
subCellSAM: Zero-Shot (Sub-)Cellular Segmentation for Hit Validation in Drug Discovery
Image and Video Processing
Finds new medicines faster by looking at cells.