SMOTE and Mirrors: Exposing Privacy Leakage from Synthetic Minority Oversampling

Published: October 16, 2025 | arXiv ID: 2510.15083v1

By: Georgi Ganev, Reza Nazari, Rees Davison, and more

Potential Business Impact:

Demonstrates that synthetic data generated with SMOTE can expose the real, private records it was derived from.

Business Areas:
Penetration Testing, Information Technology, Privacy and Security

The Synthetic Minority Over-sampling Technique (SMOTE) is one of the most widely used methods for addressing class imbalance and generating synthetic data. Despite its popularity, little attention has been paid to its privacy implications, even though it is used in the wild in many privacy-sensitive applications. In this work, we conduct the first systematic study of privacy leakage in SMOTE. We begin by showing that prevailing evaluation practices, i.e., naive distinguishing and distance-to-closest-record metrics, completely fail to detect any leakage, and that membership inference attacks (MIAs) can be instantiated with high accuracy. Then, by exploiting SMOTE's geometric properties, we build two novel attacks under very limited assumptions: DistinSMOTE, which perfectly distinguishes real from synthetic records in augmented datasets, and ReconSMOTE, which reconstructs real minority records from synthetic datasets with perfect precision and with recall approaching one under realistic imbalance ratios. We also provide theoretical guarantees for both attacks. Experiments on eight standard imbalanced datasets confirm the practicality and effectiveness of these attacks. Overall, our work reveals that SMOTE is inherently non-private and disproportionately exposes minority records, highlighting the need to reconsider its use in privacy-sensitive applications.
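To make the geometric signal behind these attacks concrete, here is a minimal, self-contained Python sketch (not the authors' implementation; the function names, tolerance, and toy data are illustrative assumptions). SMOTE places every synthetic record exactly on the line segment between a real minority record and one of its nearest minority neighbors, so any record that is a convex combination of two other records in the released dataset is almost surely synthetic. This is the core observation behind DistinSMOTE-style distinguishing.

```python
# Minimal sketch of the geometric signal SMOTE leaves behind.
# Assumptions: continuous features, toy Gaussian data, illustrative
# helper names (smote_like, on_some_segment) and tolerance (1e-9).
import numpy as np

rng = np.random.default_rng(0)

def smote_like(X, n_new, k=5):
    """Generate synthetic points by interpolating each sample toward
    one of its k nearest neighbors, as SMOTE does for the minority class."""
    n = len(X)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn = np.argsort(d, axis=1)[:, :k]        # k nearest neighbors per sample
    out = []
    for _ in range(n_new):
        i = rng.integers(n)                   # random parent record
        j = nn[i, rng.integers(k)]            # random neighbor of the parent
        u = rng.random()                      # interpolation factor in [0, 1)
        out.append(X[i] + u * (X[j] - X[i]))  # point on the segment X[i]->X[j]
    return np.array(out)

def on_some_segment(x, X, tol=1e-9):
    """Return True if x lies on the segment between some pair of rows of X."""
    n = len(X)
    for a in range(n):
        for b in range(a + 1, n):
            v = X[b] - X[a]
            denom = v @ v
            if denom == 0:
                continue
            u = (x - X[a]) @ v / denom        # projection coefficient
            if 0 <= u <= 1 and np.linalg.norm(X[a] + u * v - x) < tol:
                return True
    return False

# Tiny demo: 30 real minority records in 5-d, 60 synthetic ones.
real = rng.normal(size=(30, 5))
synth = smote_like(real, 60)
mixed = np.vstack([real, synth])
labels = np.array([0] * len(real) + [1] * len(synth))   # 1 = synthetic

pred = np.array([on_some_segment(x, np.delete(mixed, i, axis=0))
                 for i, x in enumerate(mixed)])
print("accuracy:", (pred == labels).mean())   # ~1.0 on continuous data
```

On continuous data in general position, real records are essentially never exactly collinear with two other records, so the segment test separates real from synthetic almost perfectly. Reconstruction attacks in the spirit of ReconSMOTE push further: since every synthetic point sits on a segment whose endpoints are real minority records, recovering those endpoints from the synthetic points alone recovers the real records themselves.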

Country of Origin
🇬🇧 United Kingdom

Page Count
15 pages

Category
Computer Science:
Cryptography and Security