Federated Learning Under Temporal Drift -- Mitigating Catastrophic Forgetting via Experience Replay

Published: January 19, 2026 | arXiv ID: 2601.13456v1

By: Sahasra Kokkula, Daniel David, Aaditya Baruah

Potential Business Impact:

Keeps deployed machine-learning models from forgetting earlier patterns when client data shifts over time.

Business Areas:
A/B Testing, Data and Analytics

Federated Learning struggles under temporal concept drift where client data distributions shift over time. We demonstrate that standard FedAvg suffers catastrophic forgetting under seasonal drift on Fashion-MNIST, with accuracy dropping from 74% to 28%. We propose client-side experience replay, where each client maintains a small buffer of past samples mixed with current data during local training. This simple approach requires no changes to server aggregation. Experiments show that a 50-sample-per-class buffer restores performance to 78-82%, effectively preventing forgetting. Our ablation study reveals a clear memory-accuracy trade-off as buffer size increases.
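The mechanism described above is purely client-side: each client keeps a small per-class buffer of past samples, mixes it with its current-round data during local training, and the server still performs plain FedAvg. The following is a minimal sketch of that idea, assuming a PyTorch client; the `ReplayBuffer` and `local_train` names, the FIFO per-class cap, and all hyperparameters are illustrative choices, not the paper's actual implementation.

```python
from collections import defaultdict

import torch
from torch.utils.data import ConcatDataset, DataLoader, TensorDataset


class ReplayBuffer:
    """Per-client store of past samples, capped per class (e.g. 50 per class)."""

    def __init__(self, per_class_limit=50):
        self.per_class_limit = per_class_limit
        self.samples = defaultdict(list)  # label -> list of input tensors

    def add(self, x, y):
        """Keep a sample only while its class slot has room (simple FIFO cap)."""
        label = int(y)
        if len(self.samples[label]) < self.per_class_limit:
            self.samples[label].append(x.clone())

    def as_dataset(self):
        """Materialise the buffer as a TensorDataset for mixing with current data."""
        xs, ys = [], []
        for label, tensors in self.samples.items():
            xs.extend(tensors)
            ys.extend([label] * len(tensors))
        if not xs:
            return None
        return TensorDataset(torch.stack(xs), torch.tensor(ys))


def local_train(model, current_dataset, buffer, epochs=1, lr=0.01, batch_size=32):
    """One client round: train on current data mixed with replayed past samples."""
    replay_ds = buffer.as_dataset()
    mixed = (ConcatDataset([current_dataset, replay_ds])
             if replay_ds is not None else current_dataset)
    loader = DataLoader(mixed, batch_size=batch_size, shuffle=True)

    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = torch.nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for xb, yb in loader:
            opt.zero_grad()
            loss_fn(model(xb), yb).backward()
            opt.step()

    # Top up the buffer with this round's data so future rounds can replay it.
    for x, y in current_dataset:
        buffer.add(x, y)

    # The updated weights go back to the server; aggregation stays plain FedAvg.
    return model.state_dict()
```

Because the buffer and the mixing happen entirely inside `local_train`, the server never sees replayed data and no change to the aggregation step is needed, which is the property the abstract highlights.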

Country of Origin
🇺🇸 United States

Page Count
8 pages

Category
Computer Science:
Machine Learning (CS)