Continual learning via probabilistic exchangeable sequence modelling
By: Hanwen Xing, Christopher Yau
Potential Business Impact:
Teaches computers new things without forgetting old ones.
Continual learning (CL) refers to the ability to continuously learn and accumulate new knowledge while retaining useful information from past experiences. Although numerous CL methods have been proposed in recent years, it is not straightforward to deploy them directly to real-world decision-making problems due to their computational cost and lack of uncertainty quantification. To address these issues, we propose CL-BRUNO, a probabilistic, Neural Process-based CL model that performs scalable and tractable Bayesian updates and predictions. Our proposed approach uses deep generative models to create a unified probabilistic framework capable of handling different types of CL problems, such as task- and class-incremental learning, allowing users to integrate information across different CL scenarios using a single model. Our approach prevents catastrophic forgetting through distributional and functional regularisation without the need to retain any previously seen samples, making it appealing to applications where data privacy or storage capacity is of concern. Experiments show that CL-BRUNO outperforms existing methods on both natural image and biomedical datasets, confirming its effectiveness in real-world applications.
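To make the "Bayesian updates without retaining past samples" idea concrete, the sketch below shows a much simpler exchangeable sequence model than CL-BRUNO itself: a conjugate Gaussian per class whose posterior predictive depends on the data only through running sufficient statistics. This is not the authors' architecture; the class name, dimensions, and variance settings are illustrative assumptions chosen to demonstrate the general principle.

```python
# Minimal sketch (NOT the authors' CL-BRUNO model): an exchangeable Gaussian
# sequence model whose predictive distribution depends on past data only
# through running sufficient statistics, so previously seen samples never
# need to be stored -- the privacy/storage property highlighted above.
# All names and hyperparameters here are illustrative assumptions.

import numpy as np


class ExchangeableGaussianClassHead:
    """Per-class latent mean with a conjugate Gaussian prior.

    Embeddings z_1, z_2, ... of one class are modelled as i.i.d.
    N(mu, noise_var * I) given mu ~ N(0, prior_var * I); marginally the
    sequence is exchangeable, and the posterior over mu is available in
    closed form from the running sum and count alone.
    """

    def __init__(self, latent_dim: int, prior_var: float = 1.0, noise_var: float = 0.1):
        self.prior_var = prior_var
        self.noise_var = noise_var
        self.sum_z = np.zeros(latent_dim)   # running sufficient statistic
        self.count = 0                      # number of samples seen so far

    def update(self, z_batch: np.ndarray) -> None:
        """Bayesian update from a new batch; the batch can be discarded afterwards."""
        self.sum_z += z_batch.sum(axis=0)
        self.count += len(z_batch)

    def log_predictive(self, z: np.ndarray) -> float:
        """Log density of a new embedding under the posterior predictive."""
        post_var = 1.0 / (1.0 / self.prior_var + self.count / self.noise_var)
        post_mean = post_var * self.sum_z / self.noise_var
        pred_var = post_var + self.noise_var
        diff = z - post_mean
        d = z.shape[0]
        return -0.5 * (d * np.log(2 * np.pi * pred_var) + diff @ diff / pred_var)


# Class-incremental usage: new classes simply add new heads; old classes keep
# only their sufficient statistics, so no previously seen sample is retained.
rng = np.random.default_rng(0)
heads = {0: ExchangeableGaussianClassHead(8), 1: ExchangeableGaussianClassHead(8)}
heads[0].update(rng.normal(0.0, 0.3, size=(50, 8)))   # data from an earlier task
heads[1].update(rng.normal(1.0, 0.3, size=(50, 8)))   # data from a later task
query = rng.normal(1.0, 0.3, size=8)
scores = {c: h.log_predictive(query) for c, h in heads.items()}
print(max(scores, key=scores.get))  # predicted class for the query embedding
```

CL-BRUNO replaces this toy conjugate head with deep generative (Neural Process-style) components, but the design choice illustrated here carries over: prediction is driven by a compact, updateable summary of past data rather than a replay buffer.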
Similar Papers
Gradient-free Continual Learning
Machine Learning (CS)
Teaches computers new things without forgetting old ones.
BECAME: BayEsian Continual Learning with Adaptive Model MErging
Machine Learning (CS)
Helps computers remember old and new lessons.
MLLM-CL: Continual Learning for Multimodal Large Language Models
Computation and Language
Lets AI learn new things without forgetting old ones.