
Class Incremental Learning for Algorithm Selection

Published: June 2, 2025 | arXiv ID: 2506.01545v1

By: Mate Botond Nemeth, Emma Hart, Kevin Sim and more

Potential Business Impact:

Teaches computers to learn new tasks without forgetting old ones.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Algorithm selection is commonly used to predict the best solver from a portfolio on a per-instance basis. In many real scenarios, instances arrive in a stream: new instances become available over time, and the number of class labels can also grow as new data distributions arrive downstream. As a result, the classification model needs to be periodically updated to reflect additional solvers without catastrophic forgetting of past data. In machine learning (ML), this is referred to as Class Incremental Learning (CIL). While commonly addressed in ML settings, its relevance to algorithm selection in optimisation has not been previously studied. Using a bin-packing dataset, we benchmark 8 continual learning methods with respect to their ability to withstand catastrophic forgetting. We find that rehearsal-based methods significantly outperform other CIL methods. While there is evidence of forgetting, the loss is small, at around 7%. Hence, these methods appear to be a viable approach to continual learning in streaming optimisation scenarios.
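To make the rehearsal idea concrete, below is a minimal sketch of rehearsal-based class-incremental learning applied to a solver-selection classifier. It is not the paper's benchmarked implementation or its bin-packing dataset; the feature dimension, the `IncrementalSelector` model, and the exemplar-buffer sizes are assumptions chosen for illustration. The key ingredients are an output head that grows when new solvers (classes) appear and a small memory of past instances that is mixed into each update to limit forgetting.

```python
# Hypothetical sketch of rehearsal-based class-incremental learning for
# algorithm selection. Names, sizes and the network architecture are assumed,
# not taken from the paper.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F


class IncrementalSelector(nn.Module):
    """MLP whose output layer grows as new solvers (classes) arrive."""

    def __init__(self, n_features: int, n_solvers: int, hidden: int = 64):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU())
        self.head = nn.Linear(hidden, n_solvers)

    def add_solvers(self, n_new: int):
        """Extend the classification head, copying over old weights."""
        old = self.head
        new_head = nn.Linear(old.in_features, old.out_features + n_new)
        with torch.no_grad():
            new_head.weight[: old.out_features] = old.weight
            new_head.bias[: old.out_features] = old.bias
        self.head = new_head

    def forward(self, x):
        return self.head(self.body(x))


def train_task(model, new_x, new_y, buffer, epochs=10, lr=1e-3, batch=32):
    """Fit on the new task's instances mixed with rehearsal exemplars.

    new_x: float tensor (N, n_features); new_y: long tensor (N,) of solver ids.
    """
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    if buffer:
        old_x, old_y = zip(*buffer)
        xs = torch.cat([new_x, torch.stack(old_x)])
        ys = torch.cat([new_y, torch.stack(old_y)])
    else:
        xs, ys = new_x, new_y
    for _ in range(epochs):
        perm = torch.randperm(len(xs))
        for i in range(0, len(xs), batch):
            idx = perm[i : i + batch]
            opt.zero_grad()
            loss = F.cross_entropy(model(xs[idx]), ys[idx])
            loss.backward()
            opt.step()


def update_buffer(buffer, new_x, new_y, capacity=200):
    """Reservoir-style exemplar memory shared across all seen classes."""
    for x, y in zip(new_x, new_y):
        if len(buffer) < capacity:
            buffer.append((x, y))
        else:
            j = random.randrange(len(buffer) + 1)
            if j < capacity:
                buffer[j] = (x, y)
```

In use, each incoming batch of instances for a newly added solver would trigger `model.add_solvers(...)`, then `train_task(...)` on the mixed data, then `update_buffer(...)`; the buffer is what keeps accuracy on earlier solvers from collapsing, which is the behaviour the paper's rehearsal-based baselines exploit.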

Country of Origin
🇬🇧 United Kingdom

Page Count
4 pages

Category
Computer Science:
Machine Learning (CS)