ReLKD: Inter-Class Relation Learning with Knowledge Distillation for Generalized Category Discovery
By: Fang Zhou, Zhiqiang Chen, Martin Pavlovski, and more
Potential Business Impact:
Teaches computers to sort new things they haven't seen before.
Generalized Category Discovery (GCD) faces the challenge of categorizing unlabeled data containing both known and novel classes, given only labels for known classes. Previous studies often treat each class independently, neglecting the inherent inter-class relations. Obtaining such inter-class relations directly presents a significant challenge in real-world scenarios. To address this issue, we propose ReLKD, an end-to-end framework that effectively exploits implicit inter-class relations and leverages this knowledge to enhance the classification of novel classes. ReLKD comprises three key modules: a target-grained module for learning discriminative representations, a coarse-grained module for capturing hierarchical class relations, and a distillation module for transferring knowledge from the coarse-grained module to refine the target-grained module's representation learning. Extensive experiments on four datasets demonstrate the effectiveness of ReLKD, particularly in scenarios with limited labeled data. The code for ReLKD is available at https://github.com/ZhouF-ECNU/ReLKD.
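The distillation module described above transfers knowledge from the coarse-grained module (which captures hierarchical class relations) into the target-grained module's representation learning. A minimal sketch of one standard way such a transfer is implemented, a temperature-scaled KL distillation loss between teacher and student logits; the function names, temperature value, and exact objective here are illustrative assumptions, not ReLKD's actual loss:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) with temperature T, scaled by T^2,
    # in the style of standard Hinton-style knowledge distillation.
    p = softmax(teacher_logits, T)   # soft targets from the coarse-grained (teacher) branch
    q = softmax(student_logits, T)   # predictions from the target-grained (student) branch
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return float(kl.mean() * T * T)

# Toy example: soft targets from a coarse-grained teacher nudge the
# student's fine-grained predictions toward the shared class structure.
teacher = np.array([[2.0, 1.0, 0.1], [0.2, 3.0, 0.5]])
student = np.array([[1.5, 0.8, 0.3], [0.1, 2.5, 0.9]])
loss = kd_loss(student, teacher)
```

Minimizing this loss pulls the student's softened class distribution toward the teacher's, which is how inter-class relations learned at the coarse level can refine the target-grained representations.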
Similar Papers
Generalized Category Discovery via Reciprocal Learning and Class-Wise Distribution Regularization
CV and Pattern Recognition
Finds new things even when told what to look for.
Group Relative Knowledge Distillation: Learning from Teacher's Relational Inductive Bias
Machine Learning (CS)
Teaches computers to rank things better.
Long-Tailed Learning for Generalized Category Discovery
Artificial Intelligence
Finds new things even when there are few.