Classifier-to-Bias: Toward Unsupervised Automatic Bias Detection for Visual Classifiers
By: Quentin Guimard, Moreno D'Incà, Massimiliano Mancini, and more
Potential Business Impact:
Finds hidden unfairness in image-recognition programs.
A person downloading a pre-trained model from the web should be aware of its biases. Existing approaches for bias identification rely on datasets with labels for the task of interest, which a non-expert may not have access to or may lack the resources to collect; this greatly limits the number of tasks where model biases can be identified. In this work, we present Classifier-to-Bias (C2B), the first bias discovery framework that works without access to any labeled data: it relies only on a textual description of the classification task to identify biases in the target classification model. This description is fed to a large language model, which generates bias proposals and corresponding captions depicting the biases together with task-specific target labels. A retrieval model collects images for those captions, which are then used to assess the accuracy of the model w.r.t. the given biases. C2B is training-free, requires no annotations, places no constraints on the list of biases, and can be applied to any pre-trained model on any classification task. Experiments on two publicly available datasets show that C2B discovers biases beyond those of the original datasets and outperforms a recent state-of-the-art bias detection baseline that relies on task-specific annotations, making it a promising first step toward task-agnostic unsupervised bias detection.
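To make the pipeline concrete, below is a minimal Python sketch of the three stages the abstract describes: an LLM proposes bias attributes and captions, a retrieval model collects images for each caption, and the target classifier is scored per bias group. Every function name, the dummy return values, and the caption template are illustrative assumptions, not the authors' actual code or API.

```python
# Minimal sketch of the C2B pipeline described in the abstract.
# All names below (propose_biases, retrieve_images, make_caption, ...)
# are hypothetical stand-ins, not the authors' implementation.

from collections import defaultdict
from typing import Callable


def propose_biases(task_description: str) -> dict[str, list[str]]:
    """Stand-in for the LLM step: given a textual task description,
    return candidate bias attributes and their possible values."""
    return {"background": ["indoor", "outdoor"]}  # dummy proposal


def make_caption(label: str, attribute: str, value: str) -> str:
    """Pair a task label with a bias value to form a retrieval query
    (assumed template; the real prompts come from the LLM)."""
    return f"a photo of a {label}, {value} {attribute}"


def retrieve_images(caption: str, k: int = 50) -> list[str]:
    """Stand-in for the retrieval model: fetch k images matching the
    caption from a large unlabeled pool."""
    return [f"img_{caption}_{i}.jpg" for i in range(k)]  # dummy paths


def bias_report(classifier: Callable[[str], str],
                labels: list[str],
                task_description: str) -> dict:
    """Score the target classifier per (attribute, value) group;
    large accuracy gaps between values flag candidate biases."""
    accuracy: dict = defaultdict(dict)
    for attribute, values in propose_biases(task_description).items():
        for value in values:
            correct = total = 0
            for label in labels:
                for img in retrieve_images(make_caption(label, attribute, value)):
                    correct += classifier(img) == label
                    total += 1
            accuracy[attribute][value] = correct / max(total, 1)
    return accuracy
```

In a real instantiation, retrieve_images would be backed by a text-to-image retrieval model over a web-scale image pool, as the abstract states; comparing the per-value accuracies then surfaces biases without any labeled dataset.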
Similar Papers
AutoDebias: Automated Framework for Debiasing Text-to-Image Models
CV and Pattern Recognition
Fixes AI art to show everyone fairly.
Automatic Bias Detection in Source Code Review
Software Engineering
Finds unfairness in computer code reviews.
Evaluate Bias without Manual Test Sets: A Concept Representation Perspective for LLMs
Computation and Language
Finds hidden unfairness in AI's thinking.