SyncFed: Time-Aware Federated Learning through Explicit Timestamping and Synchronization

Published: June 11, 2025 | arXiv ID: 2506.09660v1

By: Baran Can Gül, Stefanos Tziampazis, Nasser Jazdi, and more

Potential Business Impact:

Keeps collaboratively trained AI models accurate even when network delays and out-of-sync clocks cause some client updates to arrive late.

Business Areas:
Data Integration, Data and Analytics, Information Technology, Software

As Federated Learning (FL) expands to larger and more distributed environments, training consistency is challenged by network-induced delays, unsynchronized clocks, and variability in client updates. Together, these factors can produce misaligned contributions that undermine model reliability and convergence. Existing methods such as staleness-aware aggregation and model versioning address lagging updates heuristically, yet lack mechanisms to quantify staleness, especially in latency-sensitive and cross-regional deployments. In light of these considerations, we introduce SyncFed, a time-aware FL framework that employs explicit synchronization and timestamping to establish a common temporal reference across the system. Staleness is quantified numerically from timestamps exchanged under the Network Time Protocol (NTP), enabling the server to reason about the relative freshness of client updates and to apply temporally informed weighting during aggregation. Our empirical evaluation on a geographically distributed testbed shows that, under SyncFed, the global model evolves within a stable temporal context, yielding improved accuracy and information freshness compared to round-based baselines devoid of temporal semantics.
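To illustrate the core idea, here is a minimal sketch of timestamp-based staleness weighting, assuming all clients and the server share NTP-synchronized clocks. The function names and the exponential-decay weight are illustrative assumptions for this sketch, not the paper's exact formulation.

```python
import time
import numpy as np

def staleness_seconds(update_timestamp: float, server_time: float) -> float:
    """Staleness as wall-clock time elapsed between a client's
    NTP-synchronized update timestamp and the server's current time."""
    return max(0.0, server_time - update_timestamp)

def temporal_weight(staleness: float, decay: float = 0.01) -> float:
    """Illustrative freshness weight: exponential decay in staleness.
    (The paper's actual weighting scheme may differ.)"""
    return float(np.exp(-decay * staleness))

def aggregate(updates):
    """Temporally weighted average of client model updates.
    `updates` is a list of (parameters: np.ndarray, timestamp: float)."""
    now = time.time()  # server clock, assumed NTP-synchronized
    weights = np.array([temporal_weight(staleness_seconds(ts, now))
                        for _, ts in updates])
    weights /= weights.sum()                      # normalize to sum to 1
    params = np.stack([p for p, _ in updates])    # (num_clients, num_params)
    return np.tensordot(weights, params, axes=1)  # weighted parameter average

# Example: three clients, the third update is 30 s stale and is down-weighted.
if __name__ == "__main__":
    now = time.time()
    updates = [(np.ones(4),     now - 1.0),
               (np.ones(4) * 2, now - 2.0),
               (np.ones(4) * 3, now - 30.0)]
    print(aggregate(updates))
```

Because every timestamp refers to the same NTP time base, staleness becomes a concrete, comparable quantity rather than a heuristic round count, which is what lets the server weight updates by freshness.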

Country of Origin
🇩🇪 Germany

Page Count
12 pages

Category
Computer Science:
Machine Learning (CS)