Modular Federated Learning: A Meta-Framework Perspective
By: Frederico Vicente, Cláudia Soares, Dušan Jakovetić
Potential Business Impact:
Lets computers learn together without sharing private data.
Federated Learning (FL) enables distributed machine learning training while preserving privacy, representing a paradigm shift for data-sensitive and decentralised environments. Despite its rapid advancements, FL remains a complex and multifaceted field, requiring a structured understanding of its methodologies, challenges, and applications. In this survey, we introduce a meta-framework perspective, conceptualising FL as a composition of modular components that systematically address core aspects such as communication, optimisation, security, and privacy. We provide a historical contextualisation of FL, tracing its evolution from distributed optimisation to modern distributed learning paradigms. Additionally, we propose a novel taxonomy distinguishing Aggregation from Alignment, introducing the concept of alignment as a fundamental operator alongside aggregation. To bridge theory with practice, we explore available FL frameworks in Python, facilitating real-world implementation. Finally, we systematise key challenges across FL sub-fields, providing insights into open research questions throughout the meta-framework modules. By structuring FL within a meta-framework of modular components and emphasising the dual role of Aggregation and Alignment, this survey provides a holistic and adaptable foundation for understanding and advancing FL research and deployment.
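To make the modular view concrete, the following is a minimal sketch, assuming a plain NumPy setting, of how a single aggregation component might be isolated behind a narrow interface; FedAvg-style sample-weighted averaging stands in for the aggregation operator, and the names ClientUpdate and fedavg_aggregate are illustrative, not drawn from the survey or from any particular Python FL framework.

```python
# Minimal sketch of an aggregation module in a modular FL round.
# Assumptions: flattened model parameters as NumPy arrays; names are illustrative.
from dataclasses import dataclass
import numpy as np


@dataclass
class ClientUpdate:
    weights: np.ndarray   # locally trained model parameters (flattened)
    num_samples: int      # size of the client's local dataset


def fedavg_aggregate(updates: list[ClientUpdate]) -> np.ndarray:
    """FedAvg-style aggregation: sample-weighted average of client models.

    In the survey's taxonomy, an alignment operator (e.g., matching client
    parameters before averaging) would plausibly sit as a separate module
    upstream of this aggregation step.
    """
    total = sum(u.num_samples for u in updates)
    return sum((u.num_samples / total) * u.weights for u in updates)


# Example round with three clients holding 4-dimensional models.
updates = [
    ClientUpdate(np.array([0.1, 0.2, 0.3, 0.4]), num_samples=100),
    ClientUpdate(np.array([0.2, 0.1, 0.4, 0.3]), num_samples=300),
    ClientUpdate(np.array([0.0, 0.3, 0.2, 0.5]), num_samples=100),
]
global_model = fedavg_aggregate(updates)
print(global_model)
```

Because the aggregation logic is confined to one function with a narrow input type, swapping in a different operator (robust aggregation, secure aggregation, or an alignment step before averaging) only touches this module, which is the kind of composability the meta-framework perspective emphasises.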
Similar Papers
Principles and Components of Federated Learning Architectures
Machine Learning (CS)
Trains computers without sharing private data.
Federated Learning: A Survey on Privacy-Preserving Collaborative Intelligence
Machine Learning (CS)
Trains computers together without sharing private info.
A Generalized Meta Federated Learning Framework with Theoretical Convergence Guarantees
Machine Learning (CS)
Helps AI learn better from many separate computers.