A Conformal Predictive Measure for Assessing Catastrophic Forgetting
By: Ioannis Pitsiorlas, Nour Jamoussi, Marios Kountouris
Potential Business Impact:
Detects when computers forget old lessons while learning new ones.
This work introduces a novel methodology for assessing catastrophic forgetting (CF) in continual learning. We propose a new conformal prediction (CP)-based metric, termed the Conformal Prediction Confidence Factor (CPCF), to quantify and evaluate CF. Our framework leverages adaptive CP to estimate forgetting by monitoring the model's confidence on previously learned tasks, providing a dynamic and practical way to track CF on earlier tasks as new ones are introduced and making the approach well suited to real-world applications. Experimental results on four benchmark datasets show a strong correlation between CPCF and the accuracy of previous tasks, validating the reliability and interpretability of the proposed metric. These results highlight the potential of CPCF as a robust and effective tool for assessing and understanding CF in dynamic learning environments.
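The abstract does not spell out how CPCF is computed, so the sketch below is only an illustration of the general idea under stated assumptions: split conformal calibration with the adaptive prediction sets (APS) nonconformity score, matching the abstract's mention of adaptive CP, and a hypothetical confidence factor defined as the reciprocal of the mean prediction-set size on data from a previous task. All function names (`aps_calibrate`, `cpcf_proxy`, and so on) are illustrative, not from the paper.

```python
import numpy as np

def aps_calibrate(cal_probs, cal_labels, alpha=0.1):
    """Split-conformal calibration with the APS score: cumulative
    probability mass, in descending order, up to and including the
    true class of each calibration sample."""
    n = len(cal_labels)
    order = np.argsort(-cal_probs, axis=1)
    cum = np.cumsum(np.take_along_axis(cal_probs, order, axis=1), axis=1)
    ranks = np.argmax(order == cal_labels[:, None], axis=1)
    scores = cum[np.arange(n), ranks]
    # Finite-sample-corrected quantile used in standard split CP.
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(scores, q_level)

def aps_set_sizes(probs, qhat):
    """Size of each adaptive prediction set: the smallest prefix of
    classes, sorted by probability, whose cumulative mass reaches qhat."""
    cum = np.cumsum(-np.sort(-probs, axis=1), axis=1)
    # Small epsilon guards against floating-point shortfall at cum[-1].
    return np.argmax(cum >= qhat - 1e-12, axis=1) + 1

def cpcf_proxy(probs, qhat):
    """Hypothetical CPCF-style factor: reciprocal of the mean set size,
    which decays as the model grows less certain on the old task."""
    return 1.0 / aps_set_sizes(probs, qhat).mean()

# Toy usage: calibrate on an old task, then compare confidence before
# and after training on a new task (simulated here with synthetic probs).
rng = np.random.default_rng(0)
n_cal, n_test, k = 500, 200, 10
labels = rng.integers(0, k, n_cal)
probs = rng.dirichlet(np.full(k, 0.3), n_cal)
probs[np.arange(n_cal), labels] += 2.0            # confident on the old task
probs /= probs.sum(axis=1, keepdims=True)
qhat = aps_calibrate(probs, labels, alpha=0.1)

before = rng.dirichlet(np.full(k, 0.3), n_test)   # peaked: still confident
after = rng.dirichlet(np.full(k, 5.0), n_test)    # flat: forgetting
print(f"proxy before new task: {cpcf_proxy(before, qhat):.3f}")
print(f"proxy after  new task: {cpcf_proxy(after, qhat):.3f}")
```

The APS score is a natural fit for this monitoring role because its prediction sets grow smoothly as the predictive distribution flattens, so a set-size-based confidence factor declines gradually, rather than abruptly, as a previously learned task is forgotten.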
Similar Papers
Benchmarking Catastrophic Forgetting Mitigation Methods in Federated Time Series Forecasting
Machine Learning (CS)
Keeps smart devices learning new things without forgetting.
On Conformal Machine Unlearning
Machine Learning (CS)
Makes AI forget specific data safely and accurately.
Mitigating Catastrophic Forgetting in Large Language Models with Forgetting-aware Pruning
Machine Learning (CS)
Keeps AI smart when learning new things.