Score: 3

Advancing Multiple Instance Learning with Continual Learning for Whole Slide Imaging

Published: May 15, 2025 | arXiv ID: 2505.10649v1

By: Xianrui Li, Yufei Cui, Jun Li, and more

BigTech Affiliations: Huawei

Potential Business Impact:

Helps AI models learn from new medical images without forgetting what they learned before.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Advances in medical imaging and deep learning have propelled progress in whole slide image (WSI) analysis, with multiple instance learning (MIL) showing promise for efficient and accurate diagnostics. However, conventional MIL models often lack adaptability to evolving datasets, as they rely on static training and cannot incorporate new information without extensive retraining. Applying continual learning (CL) to MIL models is a possible solution, but typically yields only limited improvements. In this paper, we analyze CL in the context of attention-based MIL models and find that forgetting is concentrated mainly in the attention layers of the MIL model. Building on this analysis, we propose two components for improving CL on MIL: Attention Knowledge Distillation (AKD) and the Pseudo-Bag Memory Pool (PMP). AKD mitigates catastrophic forgetting by retaining attention-layer knowledge between learning sessions, while PMP reduces the memory footprint by selectively storing only the most informative patches, or "pseudo-bags", from each WSI. Experimental evaluations demonstrate that our method significantly improves both accuracy and memory efficiency on diverse WSI datasets, outperforming current state-of-the-art CL methods. This work provides a foundation for CL on large-scale, weakly annotated clinical datasets, paving the way for more adaptable and resilient diagnostic models.
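
To make the two components concrete, here is a minimal PyTorch sketch of the ideas as described in the abstract. The `AttnMIL` module, the KL-based distillation term in `akd_loss`, the `akd_weight` coefficient, and the top-k selection in `make_pseudo_bag` are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttnMIL(nn.Module):
    """Attention-based MIL: score each patch embedding, pool, classify."""
    def __init__(self, in_dim=512, hid_dim=128, n_classes=2):
        super().__init__()
        self.attn = nn.Sequential(
            nn.Linear(in_dim, hid_dim), nn.Tanh(), nn.Linear(hid_dim, 1)
        )
        self.head = nn.Linear(in_dim, n_classes)

    def forward(self, bag):                       # bag: (n_patches, in_dim)
        weights = torch.softmax(self.attn(bag).squeeze(-1), dim=0)
        pooled = weights @ bag                    # attention-weighted pooling
        return self.head(pooled), weights

def akd_loss(model, old_model, bag, label, akd_weight=1.0):
    """Task loss plus a KL term that pulls the current attention
    distribution toward the frozen previous-session model's, since the
    paper locates forgetting mainly in the attention layers."""
    logits, attn_new = model(bag)
    with torch.no_grad():
        _, attn_old = old_model(bag)
    task = F.cross_entropy(logits.unsqueeze(0), label.view(1))
    distill = F.kl_div(attn_new.clamp_min(1e-8).log(), attn_old,
                       reduction="sum")
    return task + akd_weight * distill

def make_pseudo_bag(bag, weights, k=32):
    """PMP-style selection: keep only the k most-attended patches of a
    WSI as a compact 'pseudo-bag' for the replay memory."""
    idx = torch.topk(weights, k=min(k, weights.numel())).indices
    return bag[idx].detach()
```

In this sketch, `old_model` would be a frozen copy of the model taken before each new learning session (e.g. `copy.deepcopy(model).eval()`), and the stored pseudo-bags would be replayed alongside the new session's data.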

Country of Origin
🇨🇳 🇭🇰 Hong Kong, China

Page Count
15 pages

Category
Computer Science:
CV and Pattern Recognition