Imbalance-Robust and Sampling-Efficient Continuous Conditional GANs via Adaptive Vicinity and Auxiliary Regularization
By: Xin Ding, Yun Chen, Yongwei Wang, and more
Potential Business Impact:
Makes AI create realistic images faster.
Recent advances in conditional generative modeling have introduced the Continuous conditional Generative Adversarial Network (CcGAN) and the Continuous Conditional Diffusion Model (CCDM) for estimating high-dimensional data distributions conditioned on scalar, continuous regression labels (e.g., angles, ages, or temperatures). However, these approaches face fundamental limitations: CcGAN suffers from data imbalance due to its fixed-size vicinity constraint, while CCDM requires computationally expensive iterative sampling. We present CcGAN-AVAR, an enhanced CcGAN framework that addresses both challenges: (1) it leverages the GAN framework's native one-step generation to overcome CCDM's sampling bottleneck, achieving 300x-2000x faster inference; and (2) it introduces two novel components that specifically target data imbalance - an adaptive vicinity mechanism that dynamically adjusts the vicinity's size, and a multi-task discriminator that constructs two regularization terms (through auxiliary regression and density ratio estimation) to significantly improve generator training. Extensive experiments on four benchmark datasets (64x64 to 192x192 resolution) across eight challenging imbalanced settings demonstrate that CcGAN-AVAR achieves state-of-the-art generation quality while maintaining sampling efficiency.
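To make the adaptive vicinity idea concrete, here is a minimal, illustrative sketch. It is not the authors' exact rule (the paper's details are not given in this abstract): it assumes a CcGAN-style hard vicinity, where training samples whose labels fall within a radius of the target label are used, and it adapts that radius by growing it until a minimum number of samples is covered, so sparse label regions get wider vicinities than dense ones. The function name `adaptive_vicinity` and all parameters (`kappa0`, `min_count`, `grow`, `kappa_max`) are hypothetical.

```python
import numpy as np

def adaptive_vicinity(labels, target, kappa0=0.02, min_count=8,
                      grow=2.0, kappa_max=1.0):
    """Illustrative sketch of an adaptive hard vicinity (hypothetical rule,
    not the paper's exact mechanism): starting from radius `kappa0`, widen
    the vicinity around `target` until at least `min_count` training labels
    fall inside, capped at `kappa_max`.

    Returns the final radius and the indices of labels inside the vicinity.
    """
    labels = np.asarray(labels, dtype=float)
    kappa = kappa0
    idx = np.flatnonzero(np.abs(labels - target) <= kappa)
    # Grow the radius geometrically in sparse label regions.
    while idx.size < min_count and kappa < kappa_max:
        kappa *= grow
        idx = np.flatnonzero(np.abs(labels - target) <= kappa)
    return kappa, idx

# A target in a dense label region keeps a small vicinity; a target in a
# sparse region ends up with a wider one.
labels = np.concatenate([np.linspace(0.45, 0.55, 50), [0.05]])
k_dense, _ = adaptive_vicinity(labels, target=0.5)
k_sparse, _ = adaptive_vicinity(labels, target=0.05)
```

Under this toy rule, `k_sparse > k_dense`: the lone sample at label 0.05 forces the vicinity to expand, which is the intuition behind countering data imbalance from fixed-size vicinities.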
Similar Papers
A Conditional GAN for Tabular Data Generation with Probabilistic Sampling of Latent Subspaces
Machine Learning (CS)
Makes computer data fair for better learning.
One-shot Conditional Sampling: MMD meets Nearest Neighbors
Machine Learning (Stat)
Makes computers create better pictures from less info.
Domain Translation of a Soft Robotic Arm using Conditional Cycle Generative Adversarial Network
Robotics
Teaches robots new skills without retraining them.