Zero-Shot Image Anomaly Detection Using Generative Foundation Models
By: Lemar Abdi, Amaan Valiuddin, Francisco Caetano, and more
Potential Business Impact:
Finds weird pictures computers haven't seen.
Detecting out-of-distribution (OOD) inputs is pivotal for deploying safe vision systems in open-world environments. We revisit diffusion models, not as generators, but as universal perceptual templates for OOD detection. This research explores the use of score-based generative models as foundational tools for semantic anomaly detection across unseen datasets. Specifically, we leverage the denoising trajectories of Denoising Diffusion Models (DDMs) as a rich source of texture and semantic information. By analyzing Stein score errors, amplified through the Structural Similarity Index Metric (SSIM), we introduce a novel method for identifying anomalous samples without retraining on each target dataset. Our approach improves over the state of the art and relies on training a single model on one dataset -- CelebA -- which we find to be an effective base distribution, even outperforming more commonly used datasets such as ImageNet in several settings. Experimental results show near-perfect performance on some benchmarks, with notable headroom on others, highlighting both the strength and future potential of generative foundation models in anomaly detection.
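To make the scoring recipe concrete, here is a minimal sketch of the idea as the abstract describes it: noise a test image, let a pretrained DDM predict the noise (its score estimate), form a one-step denoised estimate, and use SSIM between the input and that estimate to amplify the score error into an anomaly score. The `TinyEps` stand-in network, the timestep subset, and the mean aggregation are illustrative assumptions, not the paper's exact procedure; in practice the network would be a DDM trained on CelebA.

```python
# Sketch of an SSIM-amplified score-error anomaly score (assumptions noted above).
import torch
import torch.nn as nn
from skimage.metrics import structural_similarity

class TinyEps(nn.Module):
    """Hypothetical stand-in for a pretrained noise-prediction (score) network."""
    def __init__(self):
        super().__init__()
        self.net = nn.Conv2d(3, 3, kernel_size=3, padding=1)

    def forward(self, x_t, t):
        return self.net(x_t)

def ddpm_alpha_bar(T=1000, beta_min=1e-4, beta_max=0.02):
    # Standard linear DDPM beta schedule; alpha_bar_t = prod(1 - beta_s).
    betas = torch.linspace(beta_min, beta_max, T)
    return torch.cumprod(1.0 - betas, dim=0)

@torch.no_grad()
def anomaly_score(x0, eps_model, alpha_bar, timesteps=(100, 400, 700)):
    """Higher score => more anomalous. x0: (1, 3, H, W) tensor in [0, 1]."""
    scores = []
    for t in timesteps:
        a = alpha_bar[t]
        noise = torch.randn_like(x0)
        # Forward diffusion: x_t = sqrt(a_bar)*x0 + sqrt(1 - a_bar)*eps.
        x_t = a.sqrt() * x0 + (1 - a).sqrt() * noise
        eps_hat = eps_model(x_t, t)
        # One-step denoised estimate of x0 from the predicted noise.
        x0_hat = ((x_t - (1 - a).sqrt() * eps_hat) / a.sqrt()).clamp(0, 1)
        # SSIM between input and estimate amplifies structural/texture
        # discrepancies in the score error; in-distribution images
        # reconstruct well, anomalies do not.
        img = x0[0].permute(1, 2, 0).cpu().numpy()
        rec = x0_hat[0].permute(1, 2, 0).cpu().numpy()
        ssim = structural_similarity(img, rec, channel_axis=2, data_range=1.0)
        scores.append(1.0 - ssim)  # low similarity -> high anomaly score
    return float(sum(scores) / len(scores))  # mean over timesteps (assumption)

if __name__ == "__main__":
    x = torch.rand(1, 3, 64, 64)  # dummy test image
    print(anomaly_score(x, TinyEps(), ddpm_alpha_bar()))
```

Because the single CelebA-trained model is reused as-is, scoring a new dataset only requires running this inference loop; no retraining is involved, which is what makes the approach zero-shot.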
Similar Papers
Out-of-Distribution Detection in Medical Imaging via Diffusion Trajectories
CV and Pattern Recognition
Finds rare sicknesses in medical scans faster.
Anomalies by Synthesis: Anomaly Detection using Generative Diffusion Models for Off-Road Navigation
Robotics
Helps robots spot weird things in pictures.
GOOD: Training-Free Guided Diffusion Sampling for Out-of-Distribution Detection
CV and Pattern Recognition
Helps computers spot fake images better.