Online Continual Graph Learning
By: Giovanni Donghi, Luca Pasa, Daniele Zambon, and more
Potential Business Impact:
Teaches computers to learn new things without forgetting.
The aim of Continual Learning (CL) is to learn new tasks incrementally while avoiding catastrophic forgetting. Online Continual Learning (OCL) specifically focuses on learning efficiently from a continuous stream of data with a shifting distribution. While recent studies explore Continual Learning on graphs using Graph Neural Networks (GNNs), only a few focus on a streaming setting. Yet many real-world graphs evolve over time and often require timely, online predictions. Current approaches, however, are not well aligned with the standard OCL setting, partly due to the lack of a clear definition of online Continual Learning on graphs. In this work, we propose a general formulation for online Continual Learning on graphs that emphasizes the efficiency requirements of batch processing over the graph topology and provides a well-defined setting for systematic model evaluation. Finally, we introduce a set of benchmarks and report the performance of several methods from the CL literature, adapted to our setting.
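To make the setting concrete, below is a minimal sketch of an online loop over a stream of graph batches: each batch is a small subgraph processed exactly once (the efficiency constraint), with a bounded experience-replay buffer as a standard CL baseline for mitigating forgetting. All names (`aggregate`, `sgd_step`, the synthetic data) are illustrative assumptions, not the paper's benchmarks or its proposed formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def aggregate(X, A):
    # One mean-aggregation message-passing step: average neighbor features.
    deg = A.sum(axis=1, keepdims=True) + 1e-9
    return (A @ X) / deg

def sgd_step(W, X, A, y, lr=0.05):
    # Single gradient step on the incoming batch: the online constraint means
    # each batch is seen once, in time proportional to the batch subgraph.
    H = aggregate(X, A)
    err = H @ W - y
    return W - lr * (H.T @ err) / len(y)

d, k = 4, 2
W_true = rng.normal(size=(d, k))   # hypothetical ground-truth mapping
W = np.zeros((d, k))
buffer = []                        # experience-replay buffer (capacity 10)

def make_batch():
    # A batch from the stream: node features, adjacency, node-level targets.
    X = rng.normal(size=(8, d))
    A = (rng.random((8, 8)) < 0.4).astype(float)
    return X, A, aggregate(X, A) @ W_true

for t in range(50):
    X, A, y = make_batch()
    W = sgd_step(W, X, A, y)                    # learn from the new batch once
    if buffer:                                  # replay one stored batch
        W = sgd_step(W, *buffer[rng.integers(len(buffer))])
    buffer = (buffer + [(X, A, y)])[-10:]       # bounded memory

# Evaluate on a held-out batch: training should beat the untrained model.
Xe, Ae, ye = make_batch()
loss_trained = float(np.mean((aggregate(Xe, Ae) @ W - ye) ** 2))
loss_zero = float(np.mean(ye ** 2))
```

The single-layer mean-aggregation model and squared-error objective stand in for whatever GNN and loss a real benchmark would use; the point is the loop structure, where memory and per-batch compute stay bounded as the stream grows.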
Similar Papers
Gradient-free Continual Learning
Machine Learning (CS)
Teaches computers new things without forgetting old ones.
Continual Reinforcement Learning for Cyber-Physical Systems: Lessons Learned and Open Challenges
Machine Learning (CS)
Teaches self-driving cars to learn new parking spots.
Real-time Continual Learning on Intel Loihi 2
Machine Learning (CS)
Lets AI learn new things without forgetting old ones.