Single-Round Scalable Analytic Federated Learning
By: Alan T. L. Bacellar, Mustafa Munir, Felipe M. G. França, and more
Potential Business Impact:
Trains AI faster without sharing private data.
Federated Learning (FL) is plagued by two key challenges: high communication overhead and performance collapse on heterogeneous (non-IID) data. Analytic FL (AFL) provides a single-round, data-distribution-invariant solution, but is limited to linear models. Subsequent non-linear approaches, such as DeepAFL, regain accuracy but sacrifice the single-round benefit. In this work, we break this trade-off. We propose SAFLe, a framework that achieves scalable non-linear expressivity by introducing a structured head of bucketed features and sparse, grouped embeddings. We prove this non-linear architecture is mathematically equivalent to a high-dimensional linear regression. This equivalence allows SAFLe to be solved with AFL's single-shot, invariant aggregation law. Empirically, SAFLe establishes a new state of the art for analytic FL, significantly outperforming both linear AFL and multi-round DeepAFL in accuracy across all benchmarks, demonstrating a highly efficient and scalable solution for federated vision.
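The single-shot, distribution-invariant aggregation the abstract refers to rests on a standard property of least squares: the sufficient statistics of a ridge regression are additive across clients. The sketch below illustrates that general principle only; the feature map, variable names, and regularizer are illustrative assumptions, not SAFLe's actual bucketed/grouped-embedding head.

```python
import numpy as np

rng = np.random.default_rng(0)

def client_statistics(X, y):
    """Each client sends only (X^T X, X^T y); no raw data leaves the client."""
    return X.T @ X, X.T @ y

def server_aggregate(stats, n_features, lam=1e-3):
    """Sum client statistics and solve the regularized normal equations once."""
    A = lam * np.eye(n_features)
    b = np.zeros(n_features)
    for Ak, bk in stats:
        A += Ak
        b += bk
    return np.linalg.solve(A, b)

# Simulate a heterogeneous (non-IID) split: shards sorted by one feature.
X = rng.normal(size=(300, 8))
w_true = rng.normal(size=8)
y = X @ w_true + 0.01 * rng.normal(size=300)
order = np.argsort(X[:, 0])
shards = np.array_split(order, 3)

stats = [client_statistics(X[idx], y[idx]) for idx in shards]
w_fed = server_aggregate(stats, n_features=8)

# Centralized ridge solution on the pooled data for comparison.
w_central = np.linalg.solve(1e-3 * np.eye(8) + X.T @ X, X.T @ y)
print(np.allclose(w_fed, w_central))
```

Because the aggregated statistics are exactly the pooled statistics, the federated solution matches the centralized one regardless of how skewed the client partitions are; this is the invariance that a non-linear model must preserve to keep the single-round guarantee.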
Similar Papers
APFL: Analytic Personalized Federated Learning via Dual-Stream Least Squares
Machine Learning (CS)
Makes AI learn better for everyone, even with different data.
Evaluation Framework for Centralized and Decentralized Aggregation Algorithm in Federated Systems
Distributed, Parallel, and Cluster Computing
Trains computers together without sharing private info.
Selective Attention Federated Learning: Improving Privacy and Efficiency for Clinical Text Classification
Computation and Language
Trains AI on private health data faster, safer.