Cluster-guided LLM-Based Anonymization of Software Analytics Data: Studying Privacy-Utility Trade-offs in JIT Defect Prediction
By: Maaz Khan, Gul Sher Khan, Ahsan Raza, and more
Potential Business Impact:
Keeps software secrets safe while still predicting problems.
The increasing use of machine learning (ML) for Just-In-Time (JIT) defect prediction raises concerns about privacy leakage from software analytics data. Existing anonymization methods, such as tabular transformations and graph perturbations, often overlook contextual dependencies among software metrics, leading to suboptimal privacy-utility trade-offs. Leveraging the contextual reasoning of Large Language Models (LLMs), we propose a cluster-guided anonymization technique that preserves contextual and statistical relationships within JIT datasets. Our method groups commits into feature-based clusters and employs an LLM to generate context-aware parameter configurations for each commit cluster, defining alpha-beta ratios and churn mixture distributions used for anonymization. Our evaluation on six projects (Cassandra, Flink, Groovy, Ignite, OpenStack, and Qt) shows that our LLM-based approach achieves privacy level 2 (IPR >= 80 percent), improving privacy by 18 to 25 percent over four state-of-the-art graph-based anonymization baselines while maintaining comparable F1 scores. Our results demonstrate that LLMs can act as adaptive anonymization engines when provided with cluster-specific statistical information about similar data points, enabling context-sensitive and privacy-preserving software analytics without compromising predictive accuracy.
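The sketch below illustrates the pipeline the abstract describes: cluster commits by their JIT metrics, summarize each cluster's statistics for an LLM, and use the returned parameters to perturb that cluster. It is a minimal illustration under assumed details; the feature names, prompt wording, parameter keys, noise model, and the `query_llm` helper are hypothetical and not taken from the paper.

```python
# Hypothetical sketch of cluster-guided, LLM-parameterized anonymization.
# Feature names, the prompt format, the JSON parameter keys, and query_llm
# are illustrative assumptions, not the paper's actual implementation.
import json
import numpy as np
from sklearn.cluster import KMeans

def anonymize_commits(X, feature_names, query_llm, n_clusters=6, seed=0):
    """X: (n_commits, n_features) JIT metrics; query_llm: prompt str -> JSON str."""
    rng = np.random.default_rng(seed)
    labels = KMeans(n_clusters=n_clusters, random_state=seed, n_init=10).fit_predict(X)
    X_anon = X.astype(float).copy()
    for c in range(n_clusters):
        idx = np.where(labels == c)[0]
        # Summarize the cluster so the LLM can reason over its statistics.
        stats = {name: {"mean": float(X[idx, j].mean()), "std": float(X[idx, j].std())}
                 for j, name in enumerate(feature_names)}
        prompt = ("Given these per-cluster commit metric statistics, return JSON with "
                  "'alpha_beta_ratio' (noise scale) and 'churn_mixture' (mixture weights): "
                  + json.dumps(stats))
        params = json.loads(query_llm(prompt))
        scale = float(params.get("alpha_beta_ratio", 0.1))
        weights = np.asarray(params.get("churn_mixture", [1.0]), dtype=float)
        weights = weights / weights.sum()
        # Apply context-aware perturbation using the LLM-chosen parameters.
        for j in range(X.shape[1]):
            component = rng.choice(len(weights), size=len(idx), p=weights)
            noise = rng.normal(0.0, scale * (1 + component) * (X[idx, j].std() + 1e-9))
            X_anon[idx, j] = X[idx, j] + noise
    return X_anon, labels
```

In this sketch, a caller would supply its own LLM client as `query_llm`; the anonymized matrix `X_anon` could then be fed to a standard JIT defect prediction model to measure the resulting privacy-utility trade-off.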
Similar Papers
Position: Privacy Is Not Just Memorization!
Cryptography and Security
Protects your secrets from smart computer programs.
Preserving Privacy and Utility in LLM-Based Product Recommendations
Information Retrieval
Keeps your private info safe while suggesting cool stuff.
RL-Finetuned LLMs for Privacy-Preserving Synthetic Rewriting
Cryptography and Security
Keeps your secrets safe when computers learn.