Hierarchy-Consistent Learning and Adaptive Loss Balancing for Hierarchical Multi-Label Classification
By: Ruobing Jiang, Mengzhe Liu, Haobing Liu, and more
Potential Business Impact:
Helps computers understand complex categories better.
Hierarchical Multi-Label Classification (HMC) faces critical challenges in maintaining structural consistency and balancing loss weighting in Multi-Task Learning (MTL). To address these issues, we propose HCAL, an MTL-based classifier that integrates prototype contrastive learning with an adaptive task-weighting mechanism. The most significant advantage of our classifier is semantic consistency, achieved both by explicitly modeling label prototypes and by aggregating features from child classes to parent classes. The other key advantage is an adaptive loss-weighting mechanism that dynamically allocates optimization resources by monitoring task-specific convergence rates, effectively resolving the "one-strong-many-weak" optimization bias inherent in traditional MTL approaches. To further enhance robustness, a prototype perturbation mechanism is formulated that injects controlled noise into the prototypes to expand decision boundaries. Additionally, we formalize a quantitative metric, the Hierarchical Violation Rate (HVR), to evaluate hierarchical consistency and generalization. Extensive experiments on three datasets demonstrate that the proposed classifier achieves both higher classification accuracy and a lower hierarchical violation rate than baseline models.
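The abstract describes the adaptive weighting and the HVR metric only at a high level, so the following is a minimal Python sketch of one plausible reading, not the paper's actual formulation: it assumes an HVR defined as the fraction of predicted-positive child labels whose parent is predicted negative, and a DWA-style loss-ratio scheme standing in for the convergence-rate monitoring. The function names hierarchical_violation_rate and convergence_based_weights, and the parent_of mapping, are illustrative assumptions.

import numpy as np

def hierarchical_violation_rate(preds, parent_of):
    # preds: (n_samples, n_labels) binary prediction matrix
    # parent_of: dict mapping child label index -> parent label index
    violations, checks = 0, 0
    for child, parent in parent_of.items():
        child_pos = preds[:, child] == 1          # samples predicting this child
        checks += int(child_pos.sum())
        violations += int((child_pos & (preds[:, parent] == 0)).sum())
    return violations / checks if checks else 0.0

def convergence_based_weights(task_losses, temperature=2.0):
    # task_losses: (n_epochs, n_tasks) history of per-task training losses.
    # Tasks whose loss is shrinking slowly (ratio near 1) receive larger weights,
    # counteracting the "one-strong-many-weak" bias.
    losses = np.asarray(task_losses, dtype=float)
    n_tasks = losses.shape[1]
    if losses.shape[0] < 2:
        return np.ones(n_tasks)                   # uniform weights until enough history
    rates = losses[-1] / losses[-2]               # per-task convergence ratio
    exp_rates = np.exp(rates / temperature)
    return n_tasks * exp_rates / exp_rates.sum()  # weights sum to n_tasks

For example, with preds = np.array([[0, 1, 0], [1, 0, 1]]) and parent_of = {1: 0, 2: 0} (labels 1 and 2 are children of label 0), one of the two predicted-positive children lacks its parent, so the sketch returns an HVR of 0.5.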
Similar Papers
Climbing the label tree: Hierarchy-preserving contrastive learning for medical imaging
Quantitative Methods
Helps doctors understand medical pictures better.
MACL: Multi-Label Adaptive Contrastive Learning Loss for Remote Sensing Image Retrieval
CV and Pattern Recognition
Finds rare things in satellite pictures better.
Enforcing Consistency and Fairness in Multi-level Hierarchical Classification with a Mask-based Output Layer
Machine Learning (CS)
Makes smart sorting fair and accurate.