CLIP-Flow: A Universal Discriminator for AI-Generated Images Inspired by Anomaly Detection
By: Zhipeng Yuan, Kai Wang, Weize Quan, and others
Potential Business Impact:
Finds fake pictures made by computers.
With the rapid advancement of AI generative models, the visual quality of AI-generated images (AIIs) has become increasingly close to that of natural images, which inevitably raises security concerns. Most AII detectors follow the conventional image classification pipeline, training on natural images and AIIs produced by a particular generative model, which can limit detection performance on AIIs from unseen generators. To address this, we propose a universal AI-generated image detector from the perspective of anomaly detection. Our discriminator does not need access to any AIIs and learns a generalizable representation with unsupervised learning. Specifically, we use a pre-trained CLIP encoder as the feature extractor and design a normalizing-flow-like unsupervised model. Instead of AIIs, proxy images, e.g., obtained by applying a spectral modification operation to natural images, are used for training. Our models are trained by minimizing the likelihood of proxy images, optionally combined with maximizing the likelihood of natural images. Extensive experiments demonstrate the effectiveness of our method on AIIs produced by various image generators.
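The abstract mentions creating proxy training images via a "spectral modification operation" on natural images, without specifying the operation. As a minimal sketch of that idea, the snippet below perturbs the high-frequency portion of an image's Fourier spectrum; the function name `make_proxy_image` and the `cutoff`/`gain` parameters are hypothetical illustrations, not the paper's actual procedure.

```python
import numpy as np

def make_proxy_image(img, cutoff=0.25, gain=0.5):
    """Create a proxy 'anomalous' image by modifying the high-frequency
    spectrum of a natural image.

    NOTE: hypothetical stand-in for the paper's unspecified spectral
    modification; parameters are illustrative only.

    img: 2-D float array (grayscale), values in [0, 1].
    cutoff: fraction of the spectrum radius treated as low frequency.
    gain: scale applied to high-frequency magnitudes.
    """
    h, w = img.shape
    # Centered 2-D Fourier spectrum of the image.
    f = np.fft.fftshift(np.fft.fft2(img))
    # Radial distance of each frequency bin from the spectrum center.
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - h / 2.0, xx - w / 2.0)
    # Boolean mask selecting the high-frequency band.
    hi = r > cutoff * min(h, w) / 2.0
    # Dampen (or amplify, if gain > 1) the high-frequency content.
    f[hi] *= gain
    # Back to the spatial domain; keep only the real part.
    proxy = np.real(np.fft.ifft2(np.fft.ifftshift(f)))
    return np.clip(proxy, 0.0, 1.0)
```

Under the paper's training scheme, such proxies would serve as the "low-likelihood" class: the flow model is trained to assign them low likelihood while (optionally) assigning natural images high likelihood, so unseen generators' artifacts register as anomalies at test time.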
Similar Papers
CausalCLIP: Causally-Informed Feature Disentanglement and Filtering for Generalizable Detection of Generated Images
CV and Pattern Recognition
Finds fake pictures even if made by new tools.
DeeCLIP: A Robust and Generalizable Transformer-Based Framework for Detecting AI-Generated Images
CV and Pattern Recognition
Finds fake pictures made by computers.