Global Convergence of Continual Learning on Non-IID Data

Published: March 24, 2025 | arXiv ID: 2503.18511v1

By: Fei Zhu, Yujing Liu, Wenzhuo Liu, and more

Potential Business Impact:

Enables models to learn new tasks sequentially without forgetting what they learned before.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Continual learning, which aims to learn multiple tasks sequentially, has gained extensive attention. However, most existing work focuses on empirical studies, and the theoretical aspects remain under-explored. Recently, a few investigations have considered the theory of continual learning, but only for linear regression, establishing results under the strict independent and identically distributed (i.i.d.) assumption and a persistent excitation condition on the feature data that may be difficult to verify or guarantee in practice. To overcome this fundamental limitation, this paper provides a general and comprehensive theoretical analysis of continual learning for regression models. Using stochastic Lyapunov functions and martingale estimation techniques, we establish, for the first time, almost sure convergence of continual learning under a general data condition. In addition, without imposing any excitation condition on the data, we provide convergence rates for the forgetting and regret metrics.
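To make the setting concrete, the sketch below simulates continual learning of a linear regression model on two tasks with non-identically distributed features and measures the forgetting metric (the increase in task-1 loss after training on task 2). This is an illustrative toy example only, not the paper's algorithm or analysis; the task construction, SGD hyperparameters, and the `make_task`/`sgd` helper names are all assumptions for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(w_true, mean, n=200, d=5):
    # Non-identically distributed features across tasks: each task
    # draws features around a different mean (so the data are non-IID).
    X = rng.normal(loc=mean, scale=1.0, size=(n, d))
    y = X @ w_true + 0.1 * rng.normal(size=n)
    return X, y

def sgd(w, X, y, lr=0.01, epochs=5):
    # Plain streaming SGD on the squared loss, one sample at a time,
    # mimicking sequential arrival of task data.
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            w = w - lr * (xi @ w - yi) * xi
    return w

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

d = 5
w1_true = rng.normal(size=d)   # task 1 ground-truth parameters
w2_true = rng.normal(size=d)   # task 2 ground-truth parameters
X1, y1 = make_task(w1_true, mean=2.0)
X2, y2 = make_task(w2_true, mean=-2.0)

# Train sequentially: task 1, then task 2, with no replay or regularization.
w = np.zeros(d)
w = sgd(w, X1, y1)
loss1_after_task1 = mse(w, X1, y1)
w = sgd(w, X2, y2)
loss1_after_task2 = mse(w, X1, y1)

# Forgetting: how much the task-1 loss grew after learning task 2.
forgetting = loss1_after_task2 - loss1_after_task1
print(f"task-1 loss after task 1: {loss1_after_task1:.4f}")
print(f"task-1 loss after task 2: {loss1_after_task2:.4f}")
print(f"forgetting: {forgetting:.4f}")
```

Because the two tasks have different ground-truth parameters and feature distributions, plain sequential SGD drifts toward task 2 and the task-1 loss rises; quantifying how fast this forgetting grows (or vanishes) without i.i.d. or excitation assumptions is what the paper's convergence-rate results address.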

Page Count
18 pages

Category
Computer Science:
Machine Learning (CS)