Privacy-Preserving CNN Training with Transfer Learning: Two Hidden Layers
By: John Chiang
Potential Business Impact:
Trains computers on secret data without seeing it.
In this paper, we demonstrate the training of a four-layer neural network entirely under fully homomorphic encryption (FHE), supporting both single-output and multi-output classification tasks in a non-interactive setting. A key contribution of our work is the observation that replacing Softmax with Sigmoid, in conjunction with the Binary Cross-Entropy (BCE) loss function, provides an effective and scalable solution for homomorphic classification. Moreover, we show that the BCE loss function, originally designed for multi-output tasks, extends naturally to the multi-class setting, thereby enabling broader applicability. We also highlight the limitations of prior loss functions, such as the SLE loss and the loss proposed at the 2019 CVPR Workshop, both of which suffer from vanishing gradients as network depth increases. To address the challenges posed by large-scale encrypted data, we further introduce an improved version of the previously proposed data-encoding scheme, Double Volley Revolver, which achieves a better trade-off between computational and memory efficiency, making FHE-based neural network training more practical. The complete, runnable C++ code implementing our work is available at: https://github.com/petitioner/ML.NNtraining
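As a rough illustration of why the Sigmoid-plus-BCE pairing is convenient here, the sketch below shows the plaintext mathematics only: a Sigmoid output layer with BCE loss over a one-hot (or multi-hot) target, whose gradient with respect to the pre-activation reduces to a simple polynomial difference. This is a minimal reference sketch, not the authors' encrypted implementation; all function and variable names are illustrative, and in the FHE setting the call to exp() would be replaced by a low-degree polynomial approximation of the sigmoid.

#include <cmath>
#include <cstddef>
#include <iostream>
#include <vector>

// Element-wise sigmoid. Under FHE, exp() is not computable on ciphertexts,
// so an encrypted version would use a low-degree polynomial approximation.
std::vector<double> sigmoid(const std::vector<double>& z) {
    std::vector<double> out(z.size());
    for (std::size_t i = 0; i < z.size(); ++i)
        out[i] = 1.0 / (1.0 + std::exp(-z[i]));
    return out;
}

// BCE loss summed over all outputs; y is a one-hot (multi-class) or
// multi-hot (multi-output) target vector.
double bce_loss(const std::vector<double>& y_hat, const std::vector<double>& y) {
    double loss = 0.0;
    for (std::size_t i = 0; i < y.size(); ++i)
        loss -= y[i] * std::log(y_hat[i]) + (1.0 - y[i]) * std::log(1.0 - y_hat[i]);
    return loss;
}

// Gradient of BCE w.r.t. the pre-activation z collapses to (y_hat - y),
// a purely polynomial expression, which makes backpropagation FHE-friendly.
std::vector<double> bce_sigmoid_grad(const std::vector<double>& y_hat,
                                     const std::vector<double>& y) {
    std::vector<double> grad(y.size());
    for (std::size_t i = 0; i < y.size(); ++i)
        grad[i] = y_hat[i] - y[i];
    return grad;
}

int main() {
    // Three-class example: logits z and a one-hot label y selecting class 1.
    std::vector<double> z = {0.5, 2.0, -1.0};
    std::vector<double> y = {0.0, 1.0, 0.0};
    std::vector<double> p = sigmoid(z);
    std::cout << "BCE loss: " << bce_loss(p, y) << "\n";
    std::cout << "grad[1]:  " << bce_sigmoid_grad(p, y)[1] << "\n";
    return 0;
}

Note that treating each class as an independent Sigmoid output is what lets the same BCE formulation cover both the multi-output and the multi-class cases described in the abstract.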
Similar Papers
CryptoUNets: Applying Convolutional Networks to Encrypted Data for Biomedical Image Segmentation
Cryptography and Security
Lets computers analyze private pictures safely.
Privacy-Preserving Federated Vision Transformer Learning Leveraging Lightweight Homomorphic Encryption in Medical AI
CV and Pattern Recognition
Keeps patient data safe while improving medical image analysis.
Evaluation of Privacy-aware Support Vector Machine (SVM) Learning using Homomorphic Encryption
Cryptography and Security
Keeps private data safe during computer learning.