Federated Learning Under Temporal Drift: Mitigating Catastrophic Forgetting via Experience Replay
By: Sahasra Kokkula, Daniel David, Aaditya Baruah
Potential Business Impact:
Keeps smart computers remembering old lessons.
Federated Learning struggles under temporal concept drift, where client data distributions shift over time. We demonstrate that standard FedAvg suffers catastrophic forgetting under seasonal drift on Fashion-MNIST, with accuracy dropping from 74% to 28%. We propose client-side experience replay, in which each client maintains a small buffer of past samples and mixes it with current data during local training. This simple approach requires no changes to server-side aggregation. Experiments show that a buffer of 50 samples per class restores accuracy to 78-82%, effectively preventing forgetting. An ablation study reveals a clear memory-accuracy trade-off as buffer size increases.
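The abstract describes the mechanism (a small per-class buffer on each client, replayed alongside current data, with standard FedAvg on the server) but no code. Below is a minimal sketch of what such a client-side replay step might look like in PyTorch; the class and function names (ReplayBuffer, local_train), the reservoir-style replacement policy, and all hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of client-side experience replay for FedAvg.
# Each client keeps a small per-class buffer of past samples and mixes
# them with its current data during local training; server aggregation
# is left untouched, as in the paper's description.
import random
from collections import defaultdict
from typing import Optional

import torch
from torch import nn
from torch.utils.data import ConcatDataset, DataLoader, TensorDataset


class ReplayBuffer:
    """Fixed-size per-class buffer (e.g. 50 samples per class)."""

    def __init__(self, per_class_capacity: int = 50):
        self.per_class_capacity = per_class_capacity
        self.samples = defaultdict(list)  # label -> list of input tensors

    def add(self, x: torch.Tensor, y: int) -> None:
        bucket = self.samples[y]
        if len(bucket) < self.per_class_capacity:
            bucket.append(x.clone())
        else:
            # Reservoir-style replacement keeps memory bounded per class.
            bucket[random.randrange(self.per_class_capacity)] = x.clone()

    def as_dataset(self) -> Optional[TensorDataset]:
        xs, ys = [], []
        for label, items in self.samples.items():
            xs.extend(items)
            ys.extend([label] * len(items))
        if not xs:
            return None
        return TensorDataset(torch.stack(xs), torch.tensor(ys))


def local_train(model: nn.Module, current_data: TensorDataset,
                buffer: ReplayBuffer, epochs: int = 1, lr: float = 0.01):
    """One client's local update: current-round data mixed with replay."""
    replay = buffer.as_dataset()
    dataset = ConcatDataset([current_data, replay]) if replay else current_data
    loader = DataLoader(dataset, batch_size=32, shuffle=True)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()

    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

    # After training, remember a bounded subset of this round's samples.
    for x, y in current_data:
        buffer.add(x, int(y))

    return model.state_dict()  # sent to the server for plain FedAvg averaging
```

Note that the server in this sketch still performs an unmodified FedAvg weight average over the returned state dicts, consistent with the abstract's claim that the approach requires no changes to server-side aggregation.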
Similar Papers
Mitigating Catastrophic Forgetting in Streaming Generative and Predictive Learning via Stateful Replay
Machine Learning (CS)
Keeps computer learning without forgetting old lessons.
Benchmarking Catastrophic Forgetting Mitigation Methods in Federated Time Series Forecasting
Machine Learning (CS)
Keeps smart devices learning new things without forgetting.
Accurate Forgetting for Heterogeneous Federated Continual Learning
Machine Learning (CS)
Helps AI keep learning while forgetting bad habits.