Uncertainty-Aware Dual-Student Knowledge Distillation for Efficient Image Classification

Published: November 24, 2025 | arXiv ID: 2511.18826v1

By: Aakash Gore, Anoushka Dey, Aryan Mishra

Potential Business Impact:

Teaches small computers to learn like big ones.

Business Areas:
Image Recognition, Data and Analytics, Software

Knowledge distillation has emerged as a powerful technique for model compression, enabling the transfer of knowledge from large teacher networks to compact student models. However, traditional knowledge distillation methods treat all teacher predictions equally, regardless of the teacher's confidence in those predictions. This paper proposes an uncertainty-aware dual-student knowledge distillation framework that leverages teacher prediction uncertainty to selectively guide student learning. We introduce a peer-learning mechanism where two heterogeneous student architectures, specifically ResNet-18 and MobileNetV2, learn collaboratively from both the teacher network and each other. Experimental results on ImageNet-100 demonstrate that our approach achieves superior performance compared to baseline knowledge distillation methods, with ResNet-18 achieving 83.84% top-1 accuracy and MobileNetV2 achieving 81.46% top-1 accuracy, representing improvements of 2.04% and 0.92%, respectively, over traditional single-student distillation approaches.
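The abstract describes two ingredients: a distillation loss that is weighted by how confident the teacher is on each sample, and a peer-learning term that lets the two students learn from each other. The sketch below illustrates one plausible way to combine them in PyTorch; the entropy-based confidence weight, the temperature T, and the mixing coefficients alpha and beta are assumptions for illustration, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def uncertainty_weighted_kd_loss(student_logits, teacher_logits, labels,
                                 peer_logits=None, T=4.0, alpha=0.7, beta=0.3):
    """Hypothetical uncertainty-aware dual-student distillation loss.

    Teacher uncertainty is estimated here via the entropy of its softened
    predictions; the paper's exact estimator and weighting may differ.
    """
    # Standard cross-entropy against the ground-truth labels
    ce = F.cross_entropy(student_logits, labels)

    # Softened teacher and student distributions
    t_prob = F.softmax(teacher_logits / T, dim=1)
    s_logprob = F.log_softmax(student_logits / T, dim=1)

    # Per-sample KL divergence between teacher and student
    kd_per_sample = F.kl_div(s_logprob, t_prob, reduction="none").sum(dim=1)

    # Confidence weight in [0, 1]: 1 minus the teacher's normalized entropy,
    # so uncertain teacher predictions contribute less to the KD term
    entropy = -(t_prob * t_prob.clamp_min(1e-8).log()).sum(dim=1)
    max_entropy = torch.log(torch.tensor(float(teacher_logits.size(1))))
    confidence = 1.0 - entropy / max_entropy

    kd = (confidence * kd_per_sample).mean() * (T * T)
    loss = ce + alpha * kd

    # Optional peer-learning term: match the other student's softened
    # predictions (detached, so gradients flow only into this student)
    if peer_logits is not None:
        p_prob = F.softmax(peer_logits.detach() / T, dim=1)
        peer_kl = F.kl_div(s_logprob, p_prob, reduction="batchmean") * (T * T)
        loss = loss + beta * peer_kl

    return loss
```

In a training step, the same function would be called twice, once per student (e.g., ResNet-18 and MobileNetV2), with the other student's logits passed as `peer_logits`, so that each compact model learns from both the teacher and its peer.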

Country of Origin
🇮🇳 India

Page Count
9 pages

Category
Computer Science:
CV and Pattern Recognition