Adaptive Federated Learning with Functional Encryption: A Comparison of Classical and Quantum-safe Options
By: Enrico Sorbera, Federica Zanetti, Giacomo Brandi, and more
Potential Business Impact:
Protects participants' private training data when multiple parties train machine learning models together.
Federated Learning (FL) is a collaborative method for training machine learning models while preserving the confidentiality of the participants' training data. Nevertheless, FL is vulnerable to reconstruction attacks that exploit shared parameters to reveal private training data. In this paper, we address this issue in the cybersecurity domain by applying Multi-Input Functional Encryption (MIFE) to a recent FL implementation for training ML-based network intrusion detection systems. We assess both classical and post-quantum solutions in terms of memory cost and computational overhead in the FL process, highlighting their impact on convergence time.
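To make the idea concrete, the sketch below shows how functional-encryption-style aggregation fits into one FL round: each client encrypts its model update, and the aggregator holds a functional key that reveals only the (averaged) sum of updates, never an individual contribution. This is an illustrative sketch only; the names (ToyMIFE, federated_round) are hypothetical, and a toy additive-masking scheme over a prime modulus stands in for the paper's actual classical or post-quantum MIFE constructions.

```python
"""Sketch of MIFE-style secure aggregation in a federated learning round.
A toy additive-masking scheme stands in for a real multi-input functional
encryption scheme for inner products; it only illustrates the data flow."""
import secrets
import numpy as np

PRIME = 2**61 - 1   # working modulus for the toy scheme
SCALE = 10**6       # fixed-point scaling for float model weights


class ToyMIFE:
    """Trusted setup hands each client a random mask; the functional key for
    the plain sum is the negated total mask, so only the sum is recoverable."""

    def __init__(self, n_clients: int, dim: int):
        self.masks = [
            np.array([secrets.randbelow(PRIME) for _ in range(dim)], dtype=object)
            for _ in range(n_clients)
        ]

    def encrypt(self, client_id: int, update: np.ndarray) -> np.ndarray:
        # fixed-point encode, then blind with the client's mask
        fixed = np.round(update * SCALE).astype(np.int64).astype(object) % PRIME
        return (fixed + self.masks[client_id]) % PRIME

    def functional_key_for_sum(self) -> np.ndarray:
        total = sum(self.masks) % PRIME
        return (-total) % PRIME

    @staticmethod
    def decrypt_sum(ciphertexts, fkey) -> np.ndarray:
        agg = (sum(ciphertexts) + fkey) % PRIME
        # map back from the field to signed fixed-point, then to floats
        agg = np.where(agg > PRIME // 2, agg - PRIME, agg)
        return agg.astype(np.float64) / SCALE


def federated_round(local_updates):
    """One FL round: clients encrypt their updates; the aggregator combines
    ciphertexts and learns only the averaged global update."""
    n, dim = len(local_updates), local_updates[0].size
    mife = ToyMIFE(n, dim)
    ciphertexts = [mife.encrypt(i, u) for i, u in enumerate(local_updates)]
    fkey = mife.functional_key_for_sum()
    return ToyMIFE.decrypt_sum(ciphertexts, fkey) / n


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    updates = [rng.normal(size=4) for _ in range(3)]
    print("secure avg:", federated_round(updates))
    print("plain  avg:", np.mean(updates, axis=0))
```

In a real deployment the masking layer would be replaced by a concrete MIFE scheme (classical DDH-based or lattice-based post-quantum), which is exactly where the memory and computational overheads compared in the paper arise.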
Similar Papers
Emerging Paradigms for Securing Federated Learning Systems
Cryptography and Security
Makes AI learn from data without seeing it.
VFEFL: Privacy-Preserving Federated Learning against Malicious Clients via Verifiable Functional Encryption
Cryptography and Security
Keeps your private data safe when computers learn together.
On the Security and Privacy of Federated Learning: A Survey with Attacks, Defenses, Frameworks, Applications, and Future Directions
Cryptography and Security
Keeps shared computer learning private and safe.