Online Continual Graph Learning

Published: August 5, 2025 | arXiv ID: 2508.03283v1

By: Giovanni Donghi, Luca Pasa, Daniele Zambon, and more

Potential Business Impact:

Enables systems to keep learning from streaming data without forgetting what they already know.

The aim of Continual Learning (CL) is to learn new tasks incrementally while avoiding catastrophic forgetting. Online Continual Learning (OCL) specifically focuses on learning efficiently from a continuous stream of data with a shifting distribution. While recent studies have explored Continual Learning on graphs using Graph Neural Networks (GNNs), only a few of them focus on a streaming setting. Yet, many real-world graphs evolve over time, often requiring timely and online predictions. Current approaches, however, are not well aligned with the standard OCL setting, partly due to the lack of a clear definition of online Continual Learning on graphs. In this work, we propose a general formulation for online Continual Learning on graphs, emphasizing the efficiency requirements on batch processing over the graph topology, and providing a well-defined setting for systematic model evaluation. Finally, we introduce a set of benchmarks and report the performance of several methods in the CL literature, adapted to our setting.
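The setting described above can be illustrated with a minimal replay-based sketch. This is not the paper's method, and all names and the toy data here are our own assumptions: a classifier consumes a stream of node batches whose class distribution shifts between two "tasks", aggregates features over the within-batch topology (a k-NN graph standing in for a GNN message-passing layer), and replays reservoir-sampled past examples to counter catastrophic forgetting.

```python
import numpy as np

rng = np.random.default_rng(0)

class ReplayBuffer:
    """Reservoir sampling: a uniform sample over the entire stream so far."""
    def __init__(self, capacity):
        self.capacity, self.seen = capacity, 0
        self.x, self.y = [], []

    def add(self, xb, yb):
        for xi, yi in zip(xb, yb):
            self.seen += 1
            if len(self.x) < self.capacity:
                self.x.append(xi); self.y.append(yi)
            else:
                j = rng.integers(0, self.seen)
                if j < self.capacity:
                    self.x[j], self.y[j] = xi, yi

    def sample(self, k):
        idx = rng.integers(0, len(self.x), size=min(k, len(self.x)))
        return np.array(self.x)[idx], np.array(self.y)[idx]

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class OnlineClassifier:
    """Softmax regression updated with one SGD step per incoming batch."""
    def __init__(self, d, c, lr=0.5):
        self.W, self.lr, self.c = np.zeros((d, c)), lr, c

    def step(self, X, y):
        p = softmax(X @ self.W)
        self.W -= self.lr * X.T @ (p - np.eye(self.c)[y]) / len(y)

    def predict(self, X):
        return (X @ self.W).argmax(axis=1)

def aggregate(X, k=3):
    # One-hop mean over a k-NN graph built inside the batch (self included):
    # a crude stand-in for processing the topology available at batch time.
    D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    nbrs = np.argsort(D, axis=1)[:, : k + 1]
    return X[nbrs].mean(axis=1)

def make_batch(task, n=32):
    # Toy drifting stream: task 'A' emits classes 0/1, task 'B' classes 2/3.
    means = {0: (2, 0), 1: (-2, 0), 2: (0, 2), 3: (0, -2)}
    y = rng.integers(0, 2, n) + (0 if task == "A" else 2)
    X = np.array([means[int(yi)] for yi in y]) + rng.normal(0, 0.5, (n, 2))
    return X, y

model, buf = OnlineClassifier(d=2, c=4), ReplayBuffer(capacity=200)
for task in ["A"] * 50 + ["B"] * 50:
    X, y = make_batch(task)
    Xa = aggregate(X)
    model.step(Xa, y)            # learn from the incoming batch once
    buf.add(Xa, y)               # then store it in the reservoir
    Xr, yr = buf.sample(32)
    model.step(Xr, yr)           # replay past examples to limit forgetting

Xt, yt = make_batch("A", 200)    # task A again, after training shifted to B
acc_A = (model.predict(aggregate(Xt)) == yt).mean()
Xt, yt = make_batch("B", 200)
acc_B = (model.predict(aggregate(Xt)) == yt).mean()
```

After the distribution shifts to task B, the replay buffer keeps task-A examples in the update stream, so accuracy on task A remains high; dropping the replay step is a simple way to observe forgetting in this toy setup.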

Country of Origin
🇮🇹 Italy

Page Count
22 pages

Category
Computer Science:
Machine Learning (CS)