Online Continual Graph Learning
By: Giovanni Donghi, Luca Pasa, Daniele Zambon, and more
Potential Business Impact:
Teaches computers to learn new things without forgetting.
The aim of Continual Learning (CL) is to learn new tasks incrementally while avoiding catastrophic forgetting. Online Continual Learning (OCL) specifically focuses on learning efficiently from a continuous stream of data with a shifting distribution. While recent studies explore Continual Learning on graphs using Graph Neural Networks (GNNs), only a few of them address a streaming setting. Yet many real-world graphs evolve over time and often require timely, online predictions. Current approaches, however, are not well aligned with the standard OCL setting, partly because online Continual Learning on graphs lacks a clear definition. In this work, we propose a general formulation of online Continual Learning on graphs that emphasizes the efficiency requirements of batch processing over the graph topology and provides a well-defined setting for systematic model evaluation. Finally, we introduce a set of benchmarks and report the performance of several methods from the CL literature, adapted to our setting.
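The streaming protocol the abstract alludes to can be illustrated with a minimal prequential ("evaluate then train") loop: each incoming batch is first predicted and only afterwards used to update the model. The sketch below is an illustrative assumption, not the paper's method: it uses a toy class-prototype classifier in place of a GNN, and plain feature tuples in place of graph batches, purely to show the online evaluation order.

```python
class MeanClassifier:
    """Toy class-prototype classifier: predicts the class whose
    running feature mean is closest to the input (squared Euclidean).
    Stands in for a real GNN in this sketch."""

    def __init__(self):
        self.sums = {}    # class -> summed feature vector
        self.counts = {}  # class -> number of examples seen

    def predict(self, x):
        if not self.sums:
            return None  # nothing learned yet
        def dist(c):
            mean = [s / self.counts[c] for s in self.sums[c]]
            return sum((a - b) ** 2 for a, b in zip(x, mean))
        return min(self.sums, key=dist)

    def update(self, x, y):
        if y not in self.sums:
            self.sums[y] = [0.0] * len(x)
            self.counts[y] = 0
        self.sums[y] = [s + a for s, a in zip(self.sums[y], x)]
        self.counts[y] += 1


def prequential_accuracy(stream, model):
    """Online CL evaluation loop: every example in a batch is
    predicted BEFORE the model sees its label, then the batch
    is used for a single training update."""
    correct = total = 0
    for batch in stream:
        for x, y in batch:          # 1) predict on unseen data
            if model.predict(x) == y:
                correct += 1
            total += 1
        for x, y in batch:          # 2) then learn from the batch
            model.update(x, y)
    return correct / total
```

In an actual OCL-on-graphs setting, each batch would additionally carry (part of) the graph topology, and the per-batch compute budget would be constrained so predictions stay timely as the stream evolves.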
Similar Papers
Online Continual Learning: A Systematic Literature Review of Approaches, Challenges, and Benchmarks
Machine Learning (CS)
Helps computers learn new things without forgetting old ones.
Gradient-free Continual Learning
Machine Learning (CS)
Teaches computers new things without forgetting old ones.
Continual Learning Should Move Beyond Incremental Classification
Machine Learning (CS)