Score: 1

A Privacy-Preserving Federated Learning Method with Homomorphic Encryption in Omics Data

Published: November 8, 2025 | arXiv ID: 2511.06064v1

By: Yusaku Negoya, Feifei Cui, Zilong Zhang, and more

Potential Business Impact:

Keeps medical secrets safe, still finds cures.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Omics data is widely employed in medical research to identify disease mechanisms and contains highly sensitive personal information. Federated Learning (FL) with Differential Privacy (DP) can protect omics data privacy against malicious attacks, but the DP approach faces an inherent trade-off: stronger privacy protection degrades predictive accuracy because of the injected noise. Homomorphic Encryption (HE), by contrast, allows computation on encrypted data, so encrypted gradients can be aggregated without DP-induced noise and predictive accuracy can be preserved; however, HE increases computation cost. To improve predictive accuracy while accounting for the computational ability of heterogeneous clients, we propose a Privacy-Preserving Machine Learning (PPML)-Hybrid method that introduces HE into FL. In the proposed PPML-Hybrid method, each client selects either HE or DP based on its computational resources, so that HE clients contribute noise-free updates while DP clients keep computational overhead low. Clients with high computational resources can further choose between HE and DP according to their privacy needs. Performance evaluation on omics datasets shows that the proposed method achieves predictive accuracy comparable to an HE-only baseline while significantly reducing computation time, and it outperforms DP-only methods under equivalent or stricter privacy budgets.
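To make the hybrid client split concrete, below is a minimal, self-contained sketch of one aggregation round, not the authors' implementation. DP clients are modeled with standard gradient clipping plus Gaussian noise, and HE clients are modeled simply as contributing noise-free updates (the encryption itself is abstracted away). All function names, the noise parameters, and the client assignment are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_update(grad, clip_norm=1.0, sigma=0.8):
    """DP client: clip the gradient and add Gaussian noise.
    clip_norm and sigma are hypothetical values, not the paper's budget."""
    norm = np.linalg.norm(grad)
    clipped = grad / max(1.0, norm / clip_norm)
    return clipped + rng.normal(0.0, sigma * clip_norm, size=grad.shape)

def he_update(grad):
    """HE client stand-in: the update would be sent encrypted and aggregated
    noise-free; here it is returned unchanged to model that property."""
    return grad

def aggregate(client_grads, client_modes):
    """Server-side averaging over a mix of HE and DP client updates."""
    processed = [he_update(g) if mode == "HE" else dp_update(g)
                 for g, mode in zip(client_grads, client_modes)]
    return np.mean(processed, axis=0)

# Toy round: five clients; the two with more compute use HE, the rest use DP.
grads = [rng.normal(size=8) for _ in range(5)]
modes = ["HE", "HE", "DP", "DP", "DP"]
print(aggregate(grads, modes))
```

The sketch only illustrates the accuracy/cost intuition: the more clients run in HE mode, the less noise enters the average, at the price of (unmodeled) encryption overhead on those clients.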

Country of Origin
πŸ‡―πŸ‡΅ πŸ‡ΊπŸ‡Έ Japan, United States

Page Count
6 pages

Category
Computer Science:
Cryptography and Security