Class Incremental Learning for Algorithm Selection
By: Mate Botond Nemeth, Emma Hart, Kevin Sim, and more
Potential Business Impact:
Teaches computers to learn new tasks without forgetting old ones.
Algorithm selection is commonly used to predict the best solver from a portfolio on a per-instance basis. In many real-world scenarios, instances arrive in a stream: new instances become available over time, and the number of class labels can also grow as new data distributions arrive downstream. As a result, the classification model needs to be periodically updated to reflect additional solvers without catastrophic forgetting of past data. In machine learning (ML), this is referred to as Class Incremental Learning (CIL). While commonly addressed in ML settings, its relevance to algorithm selection in optimisation has not previously been studied. Using a bin-packing dataset, we benchmark 8 continual learning methods with respect to their ability to withstand catastrophic forgetting. We find that rehearsal-based methods significantly outperform the other CIL methods. While there is evidence of forgetting, the loss is small, at around 7%. Hence, these methods appear to be a viable approach to continual learning in streaming optimisation scenarios.
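To make the rehearsal idea concrete, the sketch below shows one common way such methods work: a bounded replay buffer stores a sample of past (instance features, solver label) pairs, and training on each new task mixes replayed samples into the loss to limit forgetting. This is a minimal illustration, not the paper's implementation; the names (ReplayBuffer, train_task), the linear classifier, the buffer size, and the assumption that the total number of solvers is known up front are all illustrative assumptions.

```python
# Minimal sketch of rehearsal-based class-incremental learning for algorithm
# selection over pre-computed instance features. Illustrative only; not the
# paper's code.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F


class ReplayBuffer:
    """Keeps a bounded sample of past (features, solver_label) pairs."""

    def __init__(self, capacity=500):
        self.capacity = capacity
        self.items = []
        self.seen = 0

    def add(self, x, y):
        # Reservoir sampling keeps the buffer an unbiased sample of the stream.
        self.seen += 1
        if len(self.items) < self.capacity:
            self.items.append((x, y))
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.items[j] = (x, y)

    def sample(self, k):
        k = min(k, len(self.items))
        batch = random.sample(self.items, k)
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.tensor(ys)


def train_task(model, optimizer, task_data, buffer, epochs=5, replay_k=32):
    """Train on one task's data, replaying stored samples to limit forgetting."""
    for _ in range(epochs):
        for x, y in task_data:                     # x: feature vector, y: solver id
            logits = model(x.unsqueeze(0))
            loss = F.cross_entropy(logits, torch.tensor([y]))
            if buffer.items:                       # mix in rehearsal samples
                rx, ry = buffer.sample(replay_k)
                loss = loss + F.cross_entropy(model(rx), ry)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    for x, y in task_data:                         # store this task's instances
        buffer.add(x, y)


# Usage sketch: two "tasks", each introducing new solver labels (toy data).
feature_dim, num_solvers = 16, 4                   # assumed sizes
model = nn.Linear(feature_dim, num_solvers)        # single linear head for brevity
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
buffer = ReplayBuffer(capacity=500)

task_a = [(torch.randn(feature_dim), random.randrange(0, 2)) for _ in range(100)]
task_b = [(torch.randn(feature_dim), random.randrange(2, 4)) for _ in range(100)]
train_task(model, optimizer, task_a, buffer)
train_task(model, optimizer, task_b, buffer)       # replay from task_a limits forgetting
```

The design choice illustrated here is the trade-off the abstract alludes to: storing even a small buffer of past instances lets the classifier retain earlier solver classes while new ones are added, at the cost of keeping some historical data around.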
Similar Papers
Noise-Tolerant Coreset-Based Class Incremental Continual Learning
Machine Learning (CS)
Teaches computers to learn new things without forgetting.
Class-Independent Increment: An Efficient Approach for Multi-label Class-Incremental Learning
CV and Pattern Recognition
Teaches computers to learn new things without forgetting old ones.
Continual Multiple Instance Learning for Hematologic Disease Diagnosis
Machine Learning (CS)
Helps doctors diagnose diseases better over time.