Memory-Statistics Tradeoff in Continual Learning with Structural Regularization
By: Haoran Li, Jingfeng Wu, Vladimir Braverman
Potential Business Impact:
Helps computers learn new things without forgetting old ones.
We study the statistical performance of a continual learning problem with two linear regression tasks in a well-specified random design setting. We consider a structural regularization algorithm that incorporates a generalized $\ell_2$-regularization tailored to the Hessian of the previous task in order to mitigate catastrophic forgetting. We establish upper and lower bounds on the joint excess risk of this algorithm. Our analysis reveals a fundamental trade-off between memory complexity and statistical efficiency, where memory complexity is measured by the number of vectors needed to define the structural regularization. Specifically, using more vectors in the structural regularization increases the memory complexity but improves the excess risk, and vice versa. Furthermore, our theory suggests that naive continual learning without regularization suffers from catastrophic forgetting, whereas structural regularization mitigates this issue. Notably, structural regularization achieves performance comparable to joint training with simultaneous access to both tasks. These results highlight the critical role of curvature-aware regularization in continual learning.
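To make the setup concrete, here is a minimal sketch of one way a Hessian-tailored structural regularizer could look in the two-task linear regression setting described above. The rank-$k$ Hessian approximation (the "$k$ vectors" of memory), the penalty strength `lam`, and all function names are illustrative assumptions, not the paper's exact algorithm or bounds.

```python
import numpy as np

def fit_task1(X1, y1, reg=1e-6):
    """Least squares for the first task (with a tiny ridge term for stability)."""
    d = X1.shape[1]
    return np.linalg.solve(X1.T @ X1 + reg * np.eye(d), X1.T @ y1)

def topk_hessian(X1, k):
    """Rank-k approximation of the task-1 Hessian H1 = X1^T X1 / n1,
    stored as k eigenvectors and eigenvalues -- the 'memory' of task 1."""
    H1 = X1.T @ X1 / X1.shape[0]
    evals, evecs = np.linalg.eigh(H1)
    idx = np.argsort(evals)[::-1][:k]          # keep the top-k eigenpairs
    return evecs[:, idx], evals[idx]

def fit_task2_structural(X2, y2, w1, U, s, lam=1.0):
    """Task-2 least squares plus the structural penalty
    lam * (w - w1)^T (U diag(s) U^T) (w - w1)."""
    n2 = X2.shape[0]
    H1_k = U @ np.diag(s) @ U.T                # rank-k curvature surrogate
    A = X2.T @ X2 / n2 + lam * H1_k
    b = X2.T @ y2 / n2 + lam * H1_k @ w1
    return np.linalg.solve(A, b)

# Toy usage: two related linear regression tasks.
rng = np.random.default_rng(0)
d, n1, n2, k = 20, 200, 200, 5
w_star1 = rng.normal(size=d)
w_star2 = w_star1 + 0.1 * rng.normal(size=d)
X1 = rng.normal(size=(n1, d)); y1 = X1 @ w_star1 + 0.1 * rng.normal(size=n1)
X2 = rng.normal(size=(n2, d)); y2 = X2 @ w_star2 + 0.1 * rng.normal(size=n2)

w1 = fit_task1(X1, y1)
U, s = topk_hessian(X1, k)                     # memory cost: k vectors
w2 = fit_task2_structural(X2, y2, w1, U, s, lam=1.0)
```

In this sketch, larger `k` retains more of the task-1 curvature (better joint excess risk, in the paper's terminology) at the cost of storing more vectors, which is the memory-statistics trade-off the abstract describes; setting `lam=0` recovers naive continual learning without regularization.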
Similar Papers
From Continual Learning to SGD and Back: Better Rates for Continual Linear Models
Machine Learning (CS)
Prevents AI from forgetting old lessons when learning new ones.
Optimal Rates in Continual Linear Regression via Increasing Regularization
Machine Learning (CS)
Teaches computers to learn new things without forgetting.
Parabolic Continual Learning
Machine Learning (CS)
Teaches computers to remember without forgetting.