A Privacy-Preserving Federated Learning Method with Homomorphic Encryption in Omics Data
By: Yusaku Negoya, Feifei Cui, Zilong Zhang and more
Potential Business Impact:
Keeps medical secrets safe, still finds cures.
Omics data is widely employed in medical research to identify disease mechanisms and contains highly sensitive personal information. Federated Learning (FL) with Differential Privacy (DP) can protect the privacy of omics data against attacks by malicious users. However, FL with DP faces an inherent trade-off: stronger privacy protection degrades predictive accuracy because of the injected noise. In contrast, Homomorphic Encryption (HE) allows computation on encrypted data, enabling aggregation of encrypted gradients without DP-induced noise and thus higher predictive accuracy, but at the cost of greater computation. To improve predictive accuracy while accounting for the computational capabilities of heterogeneous clients, we propose a Privacy-Preserving Machine Learning (PPML)-Hybrid method that incorporates HE. In the proposed PPML-Hybrid method, distributed clients select either HE or DP based on their computational resources, so that HE clients contribute noise-free updates while DP clients keep computational overhead low. Meanwhile, clients with ample computational resources can flexibly adopt either HE or DP according to their privacy needs. Performance evaluations on omics datasets show that the proposed method achieves predictive accuracy comparable to an HE-only approach while significantly reducing computation time, and that it outperforms DP-only methods under equivalent or stricter privacy budgets.
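A rough idea of such a hybrid round can be conveyed with a small sketch. The snippet below is not the authors' implementation: the use of the Paillier cryptosystem (via the python-paillier `phe` package), the Gaussian-noise DP step, the client split, and all hyperparameters are illustrative assumptions showing how noise-free encrypted updates and noised plaintext updates might be combined in a single aggregation.

# Minimal sketch (assumed, not the paper's exact protocol): one hybrid round
# where HE clients send Paillier-encrypted gradients and DP clients send
# clipped, Gaussian-noised gradients; the server combines both groups.
import numpy as np
from phe import paillier  # python-paillier: additively homomorphic encryption

DIM = 4            # toy gradient dimension (illustrative)
CLIP_NORM = 1.0    # DP gradient clipping bound (assumed)
NOISE_STD = 0.5    # DP Gaussian noise scale (assumed)

# In a real deployment the Paillier key pair would be held by a key authority
# or shared via a threshold scheme; it is generated locally here only for demo.
pub_key, priv_key = paillier.generate_paillier_keypair(n_length=1024)

def he_client_update(gradient):
    """HE client: encrypt each gradient coordinate; no noise is added."""
    return [pub_key.encrypt(float(g)) for g in gradient]

def dp_client_update(gradient, clip=CLIP_NORM, sigma=NOISE_STD):
    """DP client: clip to a bounded L2 norm and add Gaussian noise."""
    norm = np.linalg.norm(gradient)
    clipped = gradient * min(1.0, clip / (norm + 1e-12))
    return clipped + np.random.normal(0.0, sigma * clip, size=gradient.shape)

def aggregate(he_updates, dp_updates):
    """Server: sum ciphertexts homomorphically, then combine with the DP sum."""
    n_clients = len(he_updates) + len(dp_updates)
    # Homomorphic addition of encrypted gradients (ciphertext + ciphertext).
    enc_sum = he_updates[0]
    for upd in he_updates[1:]:
        enc_sum = [a + b for a, b in zip(enc_sum, upd)]
    he_sum = np.array([priv_key.decrypt(c) for c in enc_sum])
    dp_sum = np.sum(dp_updates, axis=0)
    return (he_sum + dp_sum) / n_clients

# One toy round: two HE clients (ample compute) and two DP clients.
true_grads = [np.random.randn(DIM) for _ in range(4)]
he_upds = [he_client_update(g) for g in true_grads[:2]]
dp_upds = [dp_client_update(g) for g in true_grads[2:]]
print("aggregated update:", aggregate(he_upds, dp_upds))

In this sketch only the DP clients' contributions carry noise, so the averaged update is more accurate than a DP-only round, while the encryption cost is paid only by the clients that can afford it.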
Similar Papers
A Privacy-Centric Approach: Scalable and Secure Federated Learning Enabled by Hybrid Homomorphic Encryption
Cryptography and Security
Lets computers learn together safely, without sharing secrets.
Privacy Preserving Machine Learning Model Personalization through Federated Personalized Learning
Machine Learning (CS)
Keeps your private data safe when AI learns.
Secure Distributed Learning for CAVs: Defending Against Gradient Leakage with Leveled Homomorphic Encryption
Cryptography and Security
Keeps car data private while learning.