Condensation-Concatenation Framework for Dynamic Graph Continual Learning
By: Tingxu Yan, Ye Yuan
Dynamic graphs are prevalent in real-world scenarios, where continuous structural changes induce catastrophic forgetting in graph neural networks (GNNs). While continual learning has been extended to dynamic graphs, existing methods overlook the effects of topological changes on existing nodes. To address this, we propose a novel framework for continual learning on dynamic graphs, named Condensation-Concatenation-based Continual Learning (CCC). Specifically, CCC first condenses historical graph snapshots into compact semantic representations that aim to preserve the original label distribution and topological properties. It then selectively concatenates these historical embeddings with the current graph representations. Moreover, we refine the forgetting measure (FM) to better suit dynamic graph scenarios by quantifying the degradation in predictive performance on existing nodes caused by structural updates. Extensive experiments on four real-world datasets demonstrate that CCC consistently outperforms state-of-the-art baselines.
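The refined forgetting measure follows the usual continual-learning FM, but is evaluated on nodes that already existed before each structural update. Below is a minimal Python sketch of how such a measure could be computed; the helper names (`existing_node_accuracy`, `forgetting_measure`) and the assumption that `model(graph)` returns per-node logits are illustrative, not the authors' implementation.

```python
import torch


@torch.no_grad()
def existing_node_accuracy(model, graph, labels, existing_nodes):
    """Accuracy of the current model restricted to nodes that were already
    present before the latest structural update (hypothetical helper)."""
    model.eval()
    logits = model(graph)                       # assumed: [num_nodes, num_classes]
    preds = logits[existing_nodes].argmax(dim=-1)
    return (preds == labels[existing_nodes]).float().mean().item()


def forgetting_measure(acc_per_step):
    """
    acc_per_step[t][k]: accuracy on the nodes of snapshot k, re-evaluated
    after training on snapshot t (k <= t).  The measure averages, over past
    snapshots, the gap between the best accuracy ever reached on that
    snapshot's nodes and their accuracy after the final update.
    """
    T = len(acc_per_step)
    gaps = []
    for k in range(T - 1):
        best = max(acc_per_step[t][k] for t in range(k, T - 1))
        gaps.append(best - acc_per_step[-1][k])
    return sum(gaps) / max(len(gaps), 1)
```

In this sketch, a larger value indicates stronger forgetting: the existing nodes' predictions degrade more after the graph structure is updated.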