How Quantization Impacts Privacy Risk on LLMs for Code?

Published: July 31, 2025 | arXiv ID: 2508.00128v1

By: Md Nazmul Haque, Hua Yang, Zhou Yang, and more

Potential Business Impact:

Shows that compressing (quantizing) AI code models reduces the risk that they leak private training data, guiding safer deployment of AI coding tools.

Large language models for code (LLMs4Code) rely heavily on massive training data, including sensitive data such as cloud service credentials of projects and personally identifiable information of developers, raising serious privacy concerns. Membership inference (MI) has recently emerged as an effective tool for assessing privacy risk by identifying whether specific data belong to a model's training set. In parallel, model compression techniques, especially quantization, have gained traction for reducing computational costs and enabling the deployment of large models. However, while quantized models still retain knowledge learned from the original training data, it remains unclear whether quantization affects their ability to retain and expose private information. Answering this question is important for understanding privacy risks in real-world deployments. In this work, we conduct the first empirical study of how quantization influences task performance and privacy risk simultaneously in LLMs4Code. To do this, we apply widely used quantization techniques (static and dynamic) to three representative model families, namely Pythia, CodeGen, and GPTNeo. Our results demonstrate that quantization significantly reduces privacy risk relative to the original model. We also uncover a positive correlation between task performance and privacy risk, indicating an underlying trade-off. Moreover, we show that quantizing larger models can yield a better balance than using full-precision small models. Finally, we demonstrate that these findings generalize across different architectures, model sizes, and MI methods, offering practical guidance for safeguarding privacy when deploying compressed LLMs4Code.
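The abstract's two moving parts, quantization and membership inference, can be illustrated with a minimal Python sketch: apply PyTorch dynamic quantization to a small Pythia checkpoint, then compare a simple loss-based MI signal (the classic LOSS attack, where lower loss on a candidate sample hints at training-set membership) before and after compression. The model name, the sample string, and the loss-based scoring are illustrative assumptions, not the paper's exact experimental pipeline.

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    MODEL = "EleutherAI/pythia-70m"  # assumed small stand-in for the paper's Pythia family
    tokenizer = AutoTokenizer.from_pretrained(MODEL)
    model = AutoModelForCausalLM.from_pretrained(MODEL).eval()

    # Dynamic quantization: nn.Linear weights are stored in int8 and
    # activations are quantized on the fly at inference time (CPU).
    quantized = torch.quantization.quantize_dynamic(
        model, {torch.nn.Linear}, dtype=torch.qint8
    )

    def mi_score(m, text):
        # Loss-based MI signal: lower language-modeling loss on a candidate
        # sample is (weak) evidence that the sample was in the training set.
        enc = tokenizer(text, return_tensors="pt")
        with torch.no_grad():
            out = m(**enc, labels=enc["input_ids"])
        return out.loss.item()

    sample = "aws_secret_access_key = 'EXAMPLE-KEY'"  # hypothetical sensitive snippet
    print("full-precision loss:", mi_score(model, sample))
    print("quantized loss:     ", mi_score(quantized, sample))
    # Per the paper's finding, member and non-member losses should be harder
    # to separate for the quantized model, i.e., a weaker MI signal.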

Country of Origin
🇨🇦 🇺🇸 Canada, United States

Page Count
10 pages

Category
Computer Science:
Software Engineering