Learning-Augmented Perfectly Secure Collaborative Matrix Multiplication
By: Zixuan He, Mohammad Reza Deylam Salehi, Derya Malak, et al.
This paper presents a perfectly secure matrix multiplication (PSMM) protocol for multiparty computation (MPC) of $\mathrm{A}^{\top}\mathrm{B}$ over finite fields. The proposed scheme guarantees correctness and information-theoretic privacy against threshold-bounded, semi-honest colluding agents, under explicit local storage constraints. Our scheme encodes submatrices as evaluations of sparse masking polynomials and combines coefficient alignment with Beaver-style randomness to ensure perfect secrecy. We demonstrate that any colluding set of parties below the security threshold observes uniformly random shares, and that the recovery threshold is optimal, matching existing information-theoretic limits. Building on this framework, we introduce a learning-augmented extension that integrates tensor-decomposition-based local block multiplication, capturing both classical and learned low-rank methods. We show that the proposed learning-based PSMM preserves the privacy and recovery guarantees of MPC while delivering computational efficiency gains of up to $80\%$ as the matrix dimensions grow.
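To make the polynomial-masking idea concrete, here is a minimal toy sketch of the general approach: each matrix is hidden behind a degree-$t$ random masking polynomial, every worker multiplies its two share matrices locally, and the result $\mathrm{A}^{\top}\mathrm{B}$ is recovered by Lagrange interpolation at zero from $2t+1$ worker outputs. This is an illustrative Shamir-style construction under an assumed small prime field, not the paper's exact PSMM scheme (it omits the sparse-mask coefficient alignment, Beaver-style randomness, and storage constraints).

```python
import random

P = 101        # small prime modulus for the toy finite field (assumption)
T = 1          # privacy threshold: any T colluding workers see uniform shares
N = 2 * T + 1  # recovery threshold for the degree-2T product polynomial

def rand_mat(r, c):
    return [[random.randrange(P) for _ in range(c)] for _ in range(r)]

def mat_mul_T(A, B):
    """Compute A^T B mod P."""
    rows, cols, inner = len(A[0]), len(B[0]), len(A)
    return [[sum(A[k][i] * B[k][j] for k in range(inner)) % P
             for j in range(cols)] for i in range(rows)]

def share(M, xs):
    """Shamir-style matrix shares: M + R1*x + ... + RT*x^T mod P."""
    masks = [rand_mat(len(M), len(M[0])) for _ in range(T)]
    return [[[(M[i][j] + sum(masks[k][i][j] * pow(x, k + 1, P)
                             for k in range(T))) % P
              for j in range(len(M[0]))] for i in range(len(M))]
            for x in xs]

def interpolate_at_zero(xs, mats):
    """Lagrange-interpolate the matrix-valued polynomial at x = 0,
    recovering its constant term (the true product)."""
    r, c = len(mats[0]), len(mats[0][0])
    out = [[0] * c for _ in range(r)]
    for i, xi in enumerate(xs):
        li = 1
        for j, xj in enumerate(xs):
            if i != j:
                li = li * (-xj) % P * pow(xi - xj, P - 2, P) % P
        for a in range(r):
            for b in range(c):
                out[a][b] = (out[a][b] + li * mats[i][a][b]) % P
    return out

A, B = rand_mat(3, 2), rand_mat(3, 2)       # target: A^T B (a 2x2 matrix)
xs = list(range(1, N + 1))                  # distinct nonzero evaluation points
A_sh, B_sh = share(A, xs), share(B, xs)     # one share pair per worker
worker_out = [mat_mul_T(A_sh[i], B_sh[i]) for i in range(N)]  # local products
result = interpolate_at_zero(xs, worker_out)
assert result == mat_mul_T(A, B)
```

The share product has polynomial degree $2T$, which is why $N = 2T + 1$ workers suffice for recovery; any single worker ($T = 1$) holds only a uniformly random masked matrix.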
Similar Papers
Secure Sparse Matrix Multiplications and their Applications to Privacy-Preserving Machine Learning
Cryptography and Security
Lets computers learn from private data faster.
Quantum Private Distributed Matrix Multiplication With Degree Tables
Information Theory
Makes private math calculations faster using quantum tricks.
Analog Secure Distributed Matrix Multiplication
Information Theory
Keeps secret math calculations safe from spies.