Large Language Models for Imbalanced Classification: Diversity makes the difference
By: Dang Nguyen, Sunil Gupta, Kien Do, and more
Potential Business Impact:
Makes machine learning better with more varied examples.
Oversampling is one of the most widely used approaches for addressing imbalanced classification. The core idea is to generate additional minority samples to rebalance the dataset. Most existing methods, such as SMOTE, require converting categorical variables into numerical vectors, which often leads to information loss. Recently, large language model (LLM)-based methods have been introduced to overcome this limitation. However, current LLM-based approaches typically generate minority samples with limited diversity, which reduces robustness and generalizability in downstream classification tasks. To address this gap, we propose a novel LLM-based oversampling method designed to enhance diversity. First, we introduce a sampling strategy that conditions synthetic sample generation on both minority labels and features. Second, we develop a new permutation strategy for fine-tuning pre-trained LLMs. Third, we fine-tune the LLM not only on minority samples but also on interpolated samples to further enrich variability. Extensive experiments on 10 tabular datasets demonstrate that our method significantly outperforms eight state-of-the-art baselines, and that the generated synthetic samples are both realistic and diverse. Moreover, we provide a theoretical analysis from an entropy-based perspective, proving that our method encourages diversity in the generated samples.
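The abstract names three components: conditioning generation on minority labels and features, a permutation strategy over feature order for fine-tuning, and fine-tuning on interpolated minority samples. The paper's implementation details are not given here, so the sketch below is only a minimal illustration of how such a pipeline could be assembled; the column schema, the "col is value" serialization format, and the interpolation rule are assumptions for illustration, not the authors' method.

import random

# Hedged sketch of an LLM-oversampling data pipeline for a minority class.
# The serialization format and interpolation rule below are illustrative
# assumptions, not the paper's actual implementation.

COLUMNS = ["age", "income", "job"]   # hypothetical tabular schema
NUMERIC = {"age", "income"}          # columns eligible for linear interpolation

def serialize(row, label, order):
    """Turn one row into a training sentence; 'order' permutes feature order."""
    parts = [f"{c} is {row[c]}" for c in order]
    return ", ".join(parts) + f", label is {label}"

def permuted_examples(row, label, n_perms=3):
    """Permutation strategy (step 2): emit the same row under several random
    feature orders so the fine-tuned LLM does not overfit one column order."""
    out = []
    for _ in range(n_perms):
        order = random.sample(COLUMNS, k=len(COLUMNS))
        out.append(serialize(row, label, order))
    return out

def interpolate(row_a, row_b, alpha):
    """Interpolated samples (step 3): blend two minority rows. Numeric columns
    mix linearly; categorical columns copy one parent (an assumed rule)."""
    mixed = {}
    for c in COLUMNS:
        if c in NUMERIC:
            mixed[c] = round(alpha * row_a[c] + (1 - alpha) * row_b[c], 2)
        else:
            mixed[c] = row_a[c] if alpha >= 0.5 else row_b[c]
    return mixed

minority = [
    {"age": 34, "income": 42000, "job": "nurse"},
    {"age": 51, "income": 58000, "job": "teacher"},
]

train_texts = []
for row in minority:                          # real minority rows
    train_texts += permuted_examples(row, label="positive")
for a in minority:                            # interpolated rows for variety
    for b in minority:
        if a is not b:
            mixed = interpolate(a, b, alpha=random.random())
            train_texts += permuted_examples(mixed, label="positive")

# Conditional sampling (step 1): after fine-tuning, the LLM would be prompted
# with the minority label plus seed feature values, so generation is
# conditioned on both; the model completes the remaining features.
prompt = "label is positive, job is nurse,"
print(prompt)
print(train_texts[0])

In this sketch, randomizing the feature order and fine-tuning on interpolated rows are the two levers that push the model toward more varied completions, which matches the diversity argument the abstract makes from its entropy-based perspective.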
Similar Papers
Synthetic Feature Augmentation Improves Generalization Performance of Language Models
Computation and Language
Makes AI fair when data is uneven.
Concentration and excess risk bounds for imbalanced classification with synthetic oversampling
Machine Learning (Stat)
Helps computers learn better from unfair data.
Extrapolated Markov Chain Oversampling Method for Imbalanced Text Classification
Machine Learning (CS)
Makes computer sorting of text fairer for small groups.