Score: 1

ReLKD: Inter-Class Relation Learning with Knowledge Distillation for Generalized Category Discovery

Published: December 8, 2025 | arXiv ID: 2512.07229v1

By: Fang Zhou, Zhiqiang Chen, Martin Pavlovski, and more

Potential Business Impact:

Teaches computers to sort items into categories, including new categories they were never given labeled examples of.

Business Areas:
Image Recognition, Data and Analytics, Software

Generalized Category Discovery (GCD) addresses the challenge of categorizing unlabeled data containing both known and novel classes, given labels only for the known classes. Previous studies often treat each class independently, neglecting inherent inter-class relations, and obtaining such relations directly is a significant challenge in real-world settings. To address this issue, we propose ReLKD, an end-to-end framework that effectively exploits implicit inter-class relations and leverages this knowledge to enhance the classification of novel classes. ReLKD comprises three key modules: a target-grained module for learning discriminative representations, a coarse-grained module for capturing hierarchical class relations, and a distillation module that transfers knowledge from the coarse-grained module to refine the target-grained module's representation learning. Extensive experiments on four datasets demonstrate the effectiveness of ReLKD, particularly in scenarios with limited labeled data. The code for ReLKD is available at https://github.com/ZhouF-ECNU/ReLKD.
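To make the three-module layout concrete, below is a minimal, hypothetical PyTorch sketch of how such an architecture could be wired: a shared backbone feeds a target-grained (fine-class) head and a coarse-grained (super-class) head, and a distillation loss aligns the target-grained predictions, aggregated to the coarse level, with the coarse-grained predictions. This is not the authors' implementation; the class names, the `class_to_coarse` assignment matrix, and the temperature are assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReLKDSketch(nn.Module):
    """Hypothetical two-head model: one shared encoder, two classification heads."""
    def __init__(self, backbone: nn.Module, feat_dim: int,
                 num_target_classes: int, num_coarse_classes: int):
        super().__init__()
        self.backbone = backbone                                     # e.g. a ViT/ResNet encoder
        self.target_head = nn.Linear(feat_dim, num_target_classes)   # target-grained (fine) classes
        self.coarse_head = nn.Linear(feat_dim, num_coarse_classes)   # coarse-grained super-classes

    def forward(self, x):
        z = self.backbone(x)                  # shared representation
        return self.target_head(z), self.coarse_head(z)

def distillation_loss(target_logits, coarse_logits, class_to_coarse, T: float = 2.0):
    """Assumed coarse-to-fine knowledge transfer: aggregate fine-class probabilities
    into coarse-class probabilities via a (num_target, num_coarse) 0/1 assignment
    matrix, then match them to the coarse head's soft predictions with KL divergence."""
    p_target = F.softmax(target_logits / T, dim=-1)
    p_target_coarse = p_target @ class_to_coarse          # (B, num_coarse)
    log_p_target_coarse = torch.log(p_target_coarse.clamp_min(1e-8))
    p_coarse = F.softmax(coarse_logits / T, dim=-1)
    return F.kl_div(log_p_target_coarse, p_coarse, reduction="batchmean") * (T ** 2)
```

In the paper's framing, the coarse-grained module captures the implicit hierarchical class relations; in this sketch that structure is stood in for by the fixed `class_to_coarse` matrix, whereas the actual method learns the relations end-to-end.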

Country of Origin
🇨🇳 China

Repos / Data Links
https://github.com/ZhouF-ECNU/ReLKD

Page Count
8 pages

Category
Computer Science:
Computer Vision and Pattern Recognition