Learning Task-Agnostic Representations through Multi-Teacher Distillation

Published: October 21, 2025 | arXiv ID: 2510.18680v1

By: Philippe Formont, Maxime Darrin, Banafsheh Karimian, and more

Potential Business Impact:

Enables a single embedding model to learn richer, general-purpose representations by distilling from multiple teacher models, improving downstream performance without task-specific labels.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Casting complex inputs into tractable representations is a critical step across various fields. Diverse embedding models emerge from differences in architectures, loss functions, input modalities, and datasets, each capturing unique aspects of the input. Multi-teacher distillation leverages this diversity to enrich representations but often remains tailored to specific tasks. In this paper, we introduce a task-agnostic framework based on a "majority vote" objective function. We demonstrate that this function is bounded by the mutual information between the student's and teachers' embeddings, leading to a task-agnostic distillation loss that eliminates dependence on task-specific labels or prior knowledge. Our evaluations across text, vision, and molecular modeling show that our method effectively leverages teacher diversity, resulting in representations that enable better performance on a wide range of downstream tasks such as classification, clustering, and regression. Additionally, we train and release state-of-the-art embedding models, enhancing downstream performance in various modalities.
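To make the idea concrete, here is a minimal sketch of a multi-teacher distillation loss in PyTorch. It is not the paper's exact objective: it assumes an InfoNCE-style lower bound on the mutual information between the student's embedding and each teacher's embedding, with one linear projection head per teacher. The class name, dimensions, and temperature are illustrative assumptions.

```python
# Minimal sketch of a task-agnostic multi-teacher distillation loss.
# Assumption: each per-teacher term is an InfoNCE-style mutual-information
# lower bound; the actual objective in the paper may differ.

import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiTeacherDistillLoss(nn.Module):
    def __init__(self, student_dim, teacher_dims, temperature=0.1):
        super().__init__()
        # One projection head per teacher maps the student embedding
        # into that teacher's embedding space.
        self.heads = nn.ModuleList(
            nn.Linear(student_dim, d) for d in teacher_dims
        )
        self.temperature = temperature

    def forward(self, student_emb, teacher_embs):
        # student_emb: (batch, student_dim)
        # teacher_embs: list of (batch, teacher_dims[k]) tensors
        batch = student_emb.size(0)
        labels = torch.arange(batch, device=student_emb.device)
        total = 0.0
        for head, t_emb in zip(self.heads, teacher_embs):
            s = F.normalize(head(student_emb), dim=-1)
            t = F.normalize(t_emb, dim=-1)
            # InfoNCE: matching (student, teacher) pairs on the diagonal
            # are positives; other in-batch pairs act as negatives.
            logits = s @ t.T / self.temperature
            total = total + F.cross_entropy(logits, labels)
        # Average over teachers so the scale is independent of their count.
        return total / len(teacher_embs)


# Usage sketch with two hypothetical teachers of different widths:
loss_fn = MultiTeacherDistillLoss(student_dim=256, teacher_dims=[384, 768])
student = torch.randn(32, 256)
teachers = [torch.randn(32, 384), torch.randn(32, 768)]
loss = loss_fn(student, teachers)
```

Because each term depends only on embedding agreement within a batch, the loss requires no task labels, which is the sense in which such a distillation objective is task-agnostic.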


Page Count
47 pages

Category
Computer Science:
Machine Learning (CS)