Flipping Knowledge Distillation: Leveraging Small Models' Expertise to Enhance LLMs in Text Matching
By: Mingzhe Li, Jing Xiang, Qishen Zhang, and more
Potential Business Impact:
Teaches big AI to learn from small AI.
Knowledge distillation typically involves transferring knowledge from a Large Language Model (LLM) to a Smaller Language Model (SLM). However, in tasks such as text matching, fine-tuned smaller models often yield more effective domain-specific representations, since they focus on optimizing the similarity of input pairs. To leverage both the specialized strengths of small models and the rich semantic understanding of LLMs, we introduce a flipped knowledge distillation paradigm in which the LLM learns from the SLM. Specifically, we bridge the architectural gap between decoder-only LLMs and smaller encoder-based models by reinterpreting the LLM in an encoder-decoder manner using LoRA: the encoder generates compressed representations, while the decoder maps them to the output space. During training, the encoder produces representations and their pairwise similarities, which are aligned with the similarity scores produced by the teacher through our proposed Margin-aware Contrastive Learning (MCL) approach. MCL ensures accurate similarity for both positive and negative pairs and adaptively handles the internal differences within positive and negative samples. Our paradigm requires only a reasonably well-performing SLM, allowing the LLM to achieve improved performance. Experiments on financial and healthcare benchmarks, as well as real-world applications, confirm its effectiveness, and the model has been fully deployed in an online environment.
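For a concrete picture of the training loop described above, here is a minimal PyTorch sketch of the flipped distillation setup: a frozen small-model "teacher" scores input pairs, and a student encoder (standing in for the LoRA-adapted LLM encoder) is trained so that its pair similarities track the teacher's within a margin. The `ToyEncoder` class, the `margin` value, and the exact loss formulation are illustrative assumptions; the paper's actual MCL objective and its encoder-decoder reinterpretation of the LLM are not reproduced here.

```python
# Sketch of flipped knowledge distillation with a margin-aware contrastive
# objective. The teacher and student below are toy encoders standing in for
# a fine-tuned SLM and a LoRA-adapted LLM encoder; the loss is a simple
# margin-based stand-in for the paper's MCL, not its exact formulation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyEncoder(nn.Module):
    """Placeholder encoder producing unit-norm sentence embeddings."""
    def __init__(self, dim_in: int, dim_out: int):
        super().__init__()
        self.proj = nn.Linear(dim_in, dim_out)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.normalize(self.proj(x), dim=-1)


def margin_aware_contrastive_loss(
    student_sim: torch.Tensor,   # pair similarities from the student encoder
    teacher_sim: torch.Tensor,   # pair similarities from the frozen teacher
    labels: torch.Tensor,        # 1 = positive pair, 0 = negative pair
    margin: float = 0.1,         # hypothetical margin hyper-parameter
) -> torch.Tensor:
    """Penalize deviations from the teacher beyond a margin, applied in the
    direction that matters for each class: positives should not fall far
    below the teacher score, negatives should not rise far above it."""
    pos_gap = F.relu(teacher_sim - student_sim - margin)  # positives scored too low
    neg_gap = F.relu(student_sim - teacher_sim - margin)  # negatives scored too high
    per_pair = labels * pos_gap + (1 - labels) * neg_gap
    return per_pair.mean()


if __name__ == "__main__":
    torch.manual_seed(0)
    batch, dim_in, dim_out = 8, 32, 16

    teacher = ToyEncoder(dim_in, dim_out)  # stands in for the fine-tuned SLM
    student = ToyEncoder(dim_in, dim_out)  # stands in for the LLM-side encoder
    teacher.requires_grad_(False)          # teacher stays frozen during distillation

    optimizer = torch.optim.AdamW(student.parameters(), lr=1e-3)

    # Synthetic text-pair features and pair labels, for illustration only.
    a = torch.randn(batch, dim_in)
    b = torch.randn(batch, dim_in)
    labels = torch.randint(0, 2, (batch,)).float()

    for step in range(3):
        with torch.no_grad():
            t_sim = F.cosine_similarity(teacher(a), teacher(b), dim=-1)
        s_sim = F.cosine_similarity(student(a), student(b), dim=-1)

        loss = margin_aware_contrastive_loss(s_sim, t_sim, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        print(f"step {step}: loss = {loss.item():.4f}")
```

In a real setup, the student's embeddings would come from the LoRA-adapted LLM encoder and the teacher's from the fine-tuned SLM, with the decoder head trained separately to map the compressed representations to the output space.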
Similar Papers
Distilling Empathy from Large Language Models
Computation and Language
Makes small AI models more caring and helpful.
Bidirectional Knowledge Distillation for Enhancing Sequential Recommendation with Large Language Models
Information Retrieval
Improves movie suggestions by learning from two AI types.
Knowledge Distillation and Dataset Distillation of Large Language Models: Emerging Trends, Challenges, and Future Directions
Computation and Language
Makes big AI models smaller and faster.