Open-Attribute Recognition for Person Retrieval: Finding People Through Distinctive and Novel Attributes
By: Minjeong Park, Hongbeen Park, Sangwon Lee, and more
Potential Business Impact:
Helps find people even when their descriptions use attributes never seen during training.
Pedestrian Attribute Recognition (PAR) plays a crucial role in various vision tasks such as person retrieval and identification. Most existing attribute-based retrieval methods operate under the closed-set assumption that all attribute classes are consistently available during both training and inference. However, this assumption limits their applicability in real-world scenarios where novel attributes may emerge. Moreover, predefined attributes in benchmark datasets are often generic and shared across individuals, making them less discriminative for retrieving the target person. To address these challenges, we propose the Open-Attribute Recognition for Person Retrieval (OAPR) task, which aims to retrieve individuals based on attribute cues, regardless of whether those attributes were seen during training. To support this task, we introduce a novel framework designed to learn generalizable body part representations that cover a broad range of attribute categories. Furthermore, we reconstruct four widely used datasets for open-attribute recognition. Comprehensive experiments on these datasets demonstrate the necessity of the OAPR task and the effectiveness of our framework. The source code and pre-trained models will be publicly available upon publication.
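To make the task concrete, here is a minimal sketch of how open-attribute retrieval could work in a shared text-image embedding space. This is an illustration under assumed conventions, not the authors' framework: the function names (`cosine_sim`, `retrieve`), the max-over-parts matching scheme, and the random embeddings are hypothetical stand-ins for the features a text encoder and a body-part encoder would produce.

```python
# Minimal sketch: rank gallery persons by how well query attribute
# phrases match their body-part embeddings. Hypothetical names and
# matching scheme; not the paper's implementation.
import numpy as np

def cosine_sim(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Cosine similarity between rows of a (A, D) and b (P, D) -> (A, P)."""
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return a @ b.T

def retrieve(query_attr_embs: np.ndarray, gallery_part_embs: list, top_k: int = 5):
    """Score each person by taking, for every query attribute, its best-matching
    body part, then averaging over attributes (one plausible scheme)."""
    scores = []
    for parts in gallery_part_embs:               # (P, D) per person
        sim = cosine_sim(query_attr_embs, parts)  # (A, P)
        scores.append(sim.max(axis=1).mean())     # best part per attribute
    order = np.argsort(scores)[::-1]              # highest score first
    return order[:top_k], np.asarray(scores)[order[:top_k]]

# Toy usage with random vectors standing in for encoder outputs.
rng = np.random.default_rng(0)
query = rng.normal(size=(2, 512))                          # e.g. "red backpack", "striped shirt"
gallery = [rng.normal(size=(4, 512)) for _ in range(10)]   # 10 people, 4 parts each
idx, scores = retrieve(query, gallery, top_k=3)
print(idx, scores)
```

Because matching happens in a shared embedding space rather than against a fixed classifier head, an attribute phrase that never appeared during training can still be scored against gallery features, which is the core idea behind the open-attribute setting.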
Similar Papers
A Data-Centric Approach to Pedestrian Attribute Recognition: Synthetic Augmentation via Prompt-driven Diffusion Models
CV and Pattern Recognition
Creates synthetic training photos so computers recognize pedestrian attributes better.
Person Re-Identification System at Semantic Level based on Pedestrian Attributes Ontology
CV and Pattern Recognition
Finds people in video by matching attributes such as their clothing.
Omni-Attribute: Open-vocabulary Attribute Encoder for Visual Concept Personalization
CV and Pattern Recognition
Lets computers edit a single visual attribute of a picture.