Breaking the Layer Barrier: Remodeling Private Transformer Inference with Hybrid CKKS and MPC

Published: August 27, 2025 | arXiv ID: 2508.19525v2

By: Tianshi Xu, Wen-jie Lu, Jiangrui Yu, and more

Potential Business Impact:

Keeps sensitive data encrypted while a machine-learning model processes it, so the party running the model never sees the raw inputs.

Business Areas:
Privacy-Preserving Computation Services

This paper presents an efficient framework for private Transformer inference that combines Homomorphic Encryption (HE) and Secure Multi-party Computation (MPC) to protect data privacy. Existing methods typically use HE for linear layers (e.g., matrix multiplications) and MPC for non-linear layers (e.g., the Softmax activation), but converting between HE and MPC at every layer boundary introduces significant communication cost.

The proposed framework, dubbed BLB, overcomes this by breaking layers down into fine-grained operators and fusing adjacent linear operators, reducing the number of HE/MPC conversions needed. To manage the increased ciphertext bit width produced by the fused linear operators, BLB proposes the first secure conversion protocol between CKKS and MPC, enabling CKKS-based computation of the fused operators. BLB additionally proposes an efficient matrix multiplication protocol for fused computation in Transformers.

Extensive evaluations on BERT-base, BERT-large, and GPT2-base show that BLB achieves a 21× reduction in communication overhead compared to BOLT (S&P '24) and a 2× reduction compared to Bumblebee (NDSS '25), along with latency reductions of 13× and 1.8×, respectively, when leveraging GPU acceleration.
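To make the fusion idea concrete, here is a minimal toy cost model (not the paper's protocol; the operator names and the two-conversions-per-HE-stint assumption are illustrative). It compares a baseline where each linear operator is evaluated in HE individually against a schedule where maximal runs of adjacent linear operators share one HE stint, counting the resulting HE/MPC conversions:

```python
def conversions_per_op(ops):
    """Baseline: every linear operator enters and leaves the HE domain
    on its own, costing 2 conversions each (MPC -> HE -> MPC)."""
    return 2 * sum(1 for _, kind in ops if kind == "linear")

def conversions_fused(ops):
    """Fused: each maximal run of adjacent linear operators is evaluated
    as one HE computation, costing 2 conversions per run."""
    runs, in_run = 0, False
    for _, kind in ops:
        if kind == "linear" and not in_run:
            runs, in_run = runs + 1, True
        elif kind != "linear":
            in_run = False
    return 2 * runs

# Hypothetical fine-grained operator list for one attention sub-block.
attention_ops = [
    ("QKV_matmul", "linear"),
    ("scale", "linear"),           # adjacent linear operator
    ("scores_matmul", "linear"),   # adjacent linear operator
    ("softmax", "nonlinear"),      # stays in MPC
    ("context_matmul", "linear"),
    ("out_proj", "linear"),        # adjacent linear operator
]

print(conversions_per_op(attention_ops))   # 5 linear ops -> 10 conversions
print(conversions_fused(attention_ops))    # 2 linear runs -> 4 conversions
```

In this toy schedule the only unavoidable domain switches are those surrounding the non-linear Softmax; everything between them can stay under CKKS, which is why the fused operators need the wider ciphertext bit width the paper's CKKS/MPC conversion protocol is designed to handle.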

Country of Origin
🇨🇳 China

Page Count
20 pages

Category
Computer Science:
Cryptography and Security