CompoDistill: Attention Distillation for Compositional Reasoning in Multimodal LLMs
By: Jiwan Kim, Kibum Kim, Sangwoo Seo, and more
Potential Business Impact:
Makes smart AI understand pictures better.
Efficient Multimodal Large Language Models (MLLMs) have recently gained significant attention as a solution to the high computational complexity of full-scale MLLMs, making them more practical for real-world applications. In this regard, knowledge distillation (KD) has emerged as a promising approach, transferring rich visual and linguistic knowledge from a larger model (teacher) to a smaller model (student). However, we observe that existing KD methods struggle to effectively distill the teacher MLLM's rich visual perception abilities to the student, a challenge that has been largely overlooked in previous studies. Through a systematic analysis, we identify visual attention misalignment between the student and teacher as the main cause of this issue. Based on this insight, we propose CompoDistill, a novel KD framework that explicitly aligns the student's visual attention with that of the teacher to enhance the student's visual perception abilities. Our extensive experiments show that CompoDistill significantly improves performance on compositional reasoning tasks that require visual perception, while maintaining strong performance on the visual question answering tasks evaluated in existing studies. Furthermore, CompoDistill remains effective with a more advanced backbone, highlighting its generalizability.
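The abstract does not spell out CompoDistill's exact objective, but the core idea of aligning the student's visual attention with the teacher's can be sketched with a simple distillation loss. The snippet below is a minimal NumPy illustration, assuming a KL-divergence penalty between teacher and student attention distributions over visual tokens; the shapes, function name, and choice of KL(teacher || student) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_alignment_loss(student_logits, teacher_logits, eps=1e-8):
    """Hypothetical attention-distillation loss (not the paper's exact form).

    Both inputs are raw attention logits of shape
    (num_heads, num_queries, num_visual_tokens). Returns KL(teacher || student)
    averaged over heads and query positions.
    """
    s = softmax(student_logits)
    t = softmax(teacher_logits)
    kl = (t * (np.log(t + eps) - np.log(s + eps))).sum(axis=-1)
    return float(kl.mean())

# Toy usage: random logits standing in for teacher/student attention maps.
rng = np.random.default_rng(0)
teacher = rng.normal(size=(8, 4, 16))
student = rng.normal(size=(8, 4, 16))
loss = attention_alignment_loss(student, teacher)
```

In training, this term would be added to the usual distillation objective so that gradients push the student's attention maps toward the teacher's; the loss is zero exactly when the two distributions match.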
Similar Papers
When Better Teachers Don't Make Better Students: Revisiting Knowledge Distillation for CLIP Models in VQA
CV and Pattern Recognition
Makes smart AI models smaller and faster.
EM-KD: Distilling Efficient Multimodal Large Language Model with Unbalanced Vision Tokens
CV and Pattern Recognition
Makes AI understand pictures better without using more power.
Feature Alignment and Representation Transfer in Knowledge Distillation for Large Language Models
Computation and Language
Makes smart computer programs smaller and faster.