Score: 1

Task Matrices: Linear Maps for Cross-Model Finetuning Transfer

Published: December 16, 2025 | arXiv ID: 2512.14880v1

By: Darrin O'Brien, Dhikshith Gajulapalli, Eric Xia

Potential Business Impact:

Lets a base model reuse another model's finetuning through a cheap linear map, so it can pick up new tasks faster and at lower cost.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Results in interpretability suggest that large vision and language models learn implicit linear encodings when biased by in-context prompting. However, the existence of similar linear representations in more general adaptation regimes has not yet been demonstrated. In this work, we develop the concept of a task matrix: a linear transformation from a base model's embedding state to the finetuned model's. We demonstrate that for vision and text models across ten datasets, a base model augmented with a task matrix surpasses linear probes and sometimes approaches fully finetuned performance. Our results validate the existence of cross-layer linear encodings between pretrained and finetuned architectures. Moreover, we show that a data-based approximation for such encodings is both efficient and generalizable to multiple domains. We make our implementation publicly available.
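The core idea, a data-based linear map between base and finetuned embeddings, can be sketched in a few lines. This is a minimal illustration, not the paper's actual estimator: it assumes the task matrix is fit by ordinary least squares on paired embeddings of the same inputs, with random arrays standing in for real model activations.

```python
import numpy as np

# Hypothetical stand-ins for paired embeddings of the same n inputs,
# taken from a base model and its finetuned counterpart (dimension d).
rng = np.random.default_rng(0)
n, d = 256, 64
base_emb = rng.normal(size=(n, d))
true_map = rng.normal(size=(d, d)) / np.sqrt(d)          # unknown "ground truth" map
finetuned_emb = base_emb @ true_map + 0.01 * rng.normal(size=(n, d))

# "Task matrix" T: least-squares solution to base_emb @ T ≈ finetuned_emb.
T, *_ = np.linalg.lstsq(base_emb, finetuned_emb, rcond=None)

# Augmenting the base model: push its embeddings through T before a task head.
approx = base_emb @ T
rel_err = np.linalg.norm(approx - finetuned_emb) / np.linalg.norm(finetuned_emb)
print(f"relative reconstruction error: {rel_err:.4f}")
```

Under this toy setup the recovered map reconstructs the finetuned embeddings almost exactly, which is the sense in which a linear encoding between the two representation spaces can be approximated efficiently from data.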

Repos / Data Links

Page Count
18 pages

Category
Computer Science:
Machine Learning (CS)