FOREVER: Forgetting Curve-Inspired Memory Replay for Language Model Continual Learning
By: Yujie Feng, Hao Wang, Jian Li, and more
Potential Business Impact:
Keeps AI learning new things without forgetting.
Continual learning (CL) for large language models (LLMs) aims to enable sequential knowledge acquisition without catastrophic forgetting. Memory replay methods are widely used for their practicality and effectiveness, but most rely on fixed, step-based heuristics that often misalign with the model's actual learning progress, since identical training steps can result in varying degrees of parameter change. Motivated by recent findings that LLM forgetting mirrors the Ebbinghaus human forgetting curve, we propose FOREVER (FORgEtting curVe-inspired mEmory Replay), a novel CL framework that aligns replay schedules with a model-centric notion of time. FOREVER defines model time using the magnitude of optimizer updates, allowing forgetting curve-inspired replay intervals to align with the model's internal evolution rather than raw training steps. Building on this approach, FOREVER incorporates a forgetting curve-based replay scheduler to determine when to replay and an intensity-aware regularization mechanism to adaptively control how to replay. Extensive experiments on three CL benchmarks and models ranging from 0.6B to 13B parameters demonstrate that FOREVER consistently mitigates catastrophic forgetting.
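The abstract names three ingredients: model time measured by the magnitude of optimizer updates, a forgetting curve-based scheduler that decides when to replay, and intensity-aware regularization that controls how to replay. Below is a minimal sketch of how the first two could be wired into a training loop. All names (ModelTimeClock, ForgettingCurveScheduler), the L2 update norm, and the exponential spacing factor are illustrative assumptions, not the paper's actual implementation.

```python
import torch


class ModelTimeClock:
    """Tracks 'model time' as the cumulative magnitude of parameter updates.
    This is one plausible reading of a model-centric clock; FOREVER's exact
    definition (e.g., which norm, which parameters) may differ."""

    def __init__(self):
        self.model_time = 0.0
        self._prev_params = None

    def tick(self, model: torch.nn.Module) -> float:
        flat = torch.cat([p.detach().flatten() for p in model.parameters()])
        if self._prev_params is not None:
            # Advance model time by the L2 norm of the latest update,
            # so "time" passes faster when parameters change more.
            self.model_time += torch.norm(flat - self._prev_params).item()
        self._prev_params = flat.clone()
        return self.model_time


class ForgettingCurveScheduler:
    """Triggers replay at progressively wider intervals of model time,
    loosely mirroring Ebbinghaus-style spaced repetition. The base interval
    and growth factor here are placeholder hyperparameters."""

    def __init__(self, base_interval: float = 1.0, growth: float = 2.0):
        self.base_interval = base_interval
        self.growth = growth
        self.next_replay_time = base_interval
        self.replay_count = 0

    def should_replay(self, model_time: float) -> bool:
        if model_time >= self.next_replay_time:
            self.replay_count += 1
            # Widen the interval after each replay, as a stronger memory
            # trace is (hypothetically) assumed to decay more slowly.
            self.next_replay_time = model_time + self.base_interval * (
                self.growth ** self.replay_count
            )
            return True
        return False


# Sketch of a training loop: after each optimizer step, advance the model
# clock; replay from the memory buffer only when the scheduler fires.
# `replay_step` would also be where intensity-aware regularization could
# scale its penalty by the recent update magnitude (not shown here).
#
# clock, scheduler = ModelTimeClock(), ForgettingCurveScheduler()
# for batch in new_task_loader:
#     loss = model(**batch).loss
#     loss.backward()
#     optimizer.step(); optimizer.zero_grad()
#     t = clock.tick(model)
#     if scheduler.should_replay(t):
#         replay_step(model, memory_buffer)
```

The key design point suggested by the abstract is that the scheduler's clock advances with parameter change rather than step count, so replay intervals stretch or shrink automatically with how fast the model is actually drifting.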
Similar Papers
Forget Forgetting: Continual Learning in a World of Abundant Memory
Machine Learning (CS)
Teaches computers new things without forgetting old ones.
No Forgetting Learning: Memory-free Continual Learning
Machine Learning (CS)
Teaches computers new things without forgetting old ones.
GeRe: Towards Efficient Anti-Forgetting in Continual Learning of LLM via General Samples Replay
Computation and Language
Keeps AI smart when learning new things.