Information-Theoretic Criteria for Knowledge Distillation in Multimodal Learning

Published: October 15, 2025 | arXiv ID: 2510.13182v1

By: Rongrong Xie, Yizhou Xu, Guido Sanguinetti

Potential Business Impact:

Teaches computer models to improve a weaker data type (e.g., audio) by transferring knowledge from a richer one (e.g., video).

Business Areas:
Knowledge Management, Administrative Services

The rapid increase in multimodal data availability has sparked significant interest in cross-modal knowledge distillation (KD) techniques, where richer "teacher" modalities transfer information to weaker "student" modalities during model training to improve performance. However, despite successes across various applications, cross-modal KD does not always result in improved outcomes, and the limited theoretical understanding of when it works offers practitioners little guidance. To address this gap, we introduce the Cross-modal Complementarity Hypothesis (CCH): we propose that cross-modal KD is effective when the mutual information between teacher and student representations exceeds the mutual information between the student representation and the labels. We theoretically validate the CCH in a joint Gaussian model and further confirm it empirically across diverse multimodal datasets, including image, text, video, audio, and cancer-related omics data. Our study establishes a novel theoretical framework for understanding cross-modal KD and offers practical guidelines based on the CCH criterion to select optimal teacher modalities for improving the performance of weaker modalities.
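To make the CCH criterion concrete, here is a minimal sketch of how one might check it in the joint Gaussian setting the abstract mentions, where mutual information has a closed form in terms of correlation. The synthetic data, variable names (teacher_repr, student_repr, labels), and noise levels are all illustrative assumptions, not the paper's actual experimental setup or estimators.

```python
# Illustrative check of the Cross-modal Complementarity Hypothesis (CCH)
# under a 1-D joint Gaussian assumption, where mutual information has the
# closed form I(X; Y) = -0.5 * log(1 - corr(X, Y)^2).
# All names and data below are hypothetical, for illustration only.

import numpy as np

def gaussian_mi(x: np.ndarray, y: np.ndarray) -> float:
    """Mutual information (nats) between two 1-D arrays, assuming joint Gaussianity."""
    rho = np.corrcoef(x, y)[0, 1]
    return -0.5 * np.log(1.0 - rho ** 2)

rng = np.random.default_rng(0)
n = 5000

# Synthetic setup: a latent signal drives the labels; the teacher
# representation tracks it closely, the student only noisily.
latent = rng.normal(size=n)
labels = latent + 0.3 * rng.normal(size=n)
teacher_repr = latent + 0.2 * rng.normal(size=n)
student_repr = latent + 1.5 * rng.normal(size=n)

mi_ts = gaussian_mi(teacher_repr, student_repr)  # I(teacher; student)
mi_sy = gaussian_mi(student_repr, labels)        # I(student; labels)

# CCH criterion: cross-modal KD is expected to help the student
# when I(teacher; student) exceeds I(student; labels).
print(f"I(T;S) = {mi_ts:.3f} nats, I(S;Y) = {mi_sy:.3f} nats")
print("CCH predicts KD helps" if mi_ts > mi_sy else "CCH predicts KD does not help")
```

In practice, representations are high-dimensional and labels are often discrete, so the closed-form Gaussian expression would be replaced by a general-purpose mutual information estimator; the comparison between the two quantities is the part the CCH prescribes.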

Page Count
27 pages

Category
Computer Science:
Machine Learning (CS)