Generating Risky Samples with Conformity Constraints via Diffusion Models
By: Han Yu, Hao Zou, Xingxuan Zhang, et al.
Although neural networks achieve promising performance on many tasks, they can still fail on certain examples and thereby pose risks to applications. To discover such risky samples, prior work searches for patterns of risky samples within existing datasets or injects perturbations into them; the diversity of risky samples found this way is therefore bounded by the coverage of those datasets. To overcome this limitation, recent works adopt diffusion models to produce new risky samples beyond the coverage of existing datasets. However, these methods struggle to ensure conformity between the generated samples and their expected categories, which can introduce label noise and severely limit their effectiveness in applications. To address this issue, we propose RiskyDiff, which incorporates the embeddings of both texts and images as implicit constraints on category conformity. We also design a conformity score that further strengthens category conformity explicitly, and introduce embedding screening and risky gradient guidance mechanisms to boost the risk of the generated samples. Extensive experiments show that RiskyDiff substantially outperforms existing methods in degree of risk, generation quality, and conformity with the conditioned categories. We also show empirically that augmenting training data with generated samples of high conformity enhances the generalization ability of models.
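To make the abstract's mechanisms concrete, the sketch below illustrates one generic way a conformity constraint and risky gradient guidance could be wired into DDPM sampling. It is a minimal, hypothetical sketch, not RiskyDiff's published update rule: ToyDenoiser, embed_img, txt_emb, task_model, and the guidance weights lam_conf/lam_risk are all illustrative stand-ins (small linear layers over 64-d vectors rather than real image models), and the update follows standard classifier-guidance practice of nudging the sample along the gradient of an objective, here a conformity term plus a risk term.

```python
# Hypothetical sketch: conformity-constrained, risk-guided DDPM sampling.
# All networks are toy stand-ins; only the guidance pattern is illustrated.
import torch
import torch.nn.functional as F

DIM, T = 64, 100
betas = torch.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bars = torch.cumprod(alphas, dim=0)

class ToyDenoiser(torch.nn.Module):
    """Stand-in epsilon-predictor; a real system would use a trained U-Net."""
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(DIM + 1, 128), torch.nn.SiLU(),
            torch.nn.Linear(128, DIM))
    def forward(self, x, t):
        return self.net(torch.cat([x, t.float().view(-1, 1) / T], dim=-1))

# Hypothetical stand-ins: an image embedder, a fixed text/category embedding,
# and the task model whose failure cases we want to expose.
embed_img = torch.nn.Linear(DIM, 32)
txt_emb = torch.randn(32)        # embedding of the conditioned category
task_model = torch.nn.Linear(DIM, 10)
target = torch.tensor([3])       # category the sample should conform to

def guided_sample(denoiser, lam_conf=1.0, lam_risk=0.5):
    x = torch.randn(1, DIM)
    for t in reversed(range(T)):
        x = x.detach().requires_grad_(True)
        tt = torch.tensor([t])
        eps = denoiser(x, tt)
        ab = alpha_bars[t]
        # Predicted clean sample from the noisy state (standard DDPM identity).
        x0_hat = (x - (1 - ab).sqrt() * eps) / ab.sqrt()
        # Conformity term: keep the sample's embedding aligned with the
        # category's text embedding (the implicit constraint in the abstract).
        conf = F.cosine_similarity(embed_img(x0_hat), txt_emb, dim=-1).sum()
        # Risk term: raise the task model's loss on the conditioned category,
        # pushing the sample toward a failure case for that model.
        risk = F.cross_entropy(task_model(x0_hat), target)
        grad = torch.autograd.grad(lam_conf * conf + lam_risk * risk, x)[0]
        # DDPM posterior mean, then a guidance nudge along the gradient.
        mean = (x - betas[t] / (1 - ab).sqrt() * eps) / alphas[t].sqrt()
        x = mean.detach() + grad
        if t > 0:
            x = x + betas[t].sqrt() * torch.randn_like(x)
    return x.detach()

sample = guided_sample(ToyDenoiser())
print(sample.shape)  # torch.Size([1, 64])
```

In this reading, the conformity gradient plays the role of an explicit category constraint while the risk gradient steers generation toward samples the task model mishandles; balancing the two weights is the tension the abstract describes between risk and label conformity.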