Score: 2

Stay Unique, Stay Efficient: Preserving Model Personality in Multi-Task Merging

Published: December 1, 2025 | arXiv ID: 2512.01461v1

By: Kuangpu Guo, Yuhe Ding, Jian Liang, and more

Potential Business Impact:

Merges separately fine-tuned AI models into one multi-task model without losing the task-specific skills each one learned.

Business Areas:
Personalization, Commerce and Shopping

Model merging has emerged as a promising paradigm for enabling multi-task capabilities without additional training. However, existing methods often experience substantial performance degradation compared with individually fine-tuned models, even on similar tasks, underscoring the need to preserve task-specific information. This paper proposes Decomposition, Thresholding, and Scaling (DTS), an approximation-based personalized merging framework that preserves task-specific information with minimal storage overhead. DTS first applies singular value decomposition to the task-specific information and retains only a small subset of singular values and vectors. It then introduces a novel thresholding strategy that partitions singular vector elements into groups and assigns a scaling factor to each group. To enable generalization to unseen tasks, we further extend DTS with a variant that fuses task-specific information in a data-free manner based on the semantic similarity of task characteristics. Extensive experiments demonstrate that DTS consistently outperforms state-of-the-art baselines while requiring only 1% additional storage per task. Furthermore, experiments on unseen tasks show that the DTS variant achieves significantly better generalization performance. Our code is available at https://github.com/krumpguo/DTS.
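The abstract describes the pipeline only at a high level; the sketch below is a minimal NumPy illustration of the three stages applied to one task-specific weight delta, not the paper's implementation. The function names (`dts_compress`, `dts_reconstruct`), the quantile-based grouping, and the mean-magnitude scaling factors are all assumptions for illustration; the paper's exact thresholding rule and scaling factors are in the linked repository.

```python
import numpy as np

def dts_compress(task_delta: np.ndarray, rank: int = 16,
                 num_groups: int = 4) -> dict:
    """Sketch of DTS-style compression of a task-specific weight delta.

    1. Decomposition: truncated SVD keeps only the top-`rank`
       singular values/vectors of the task-specific information.
    2. Thresholding: singular-vector elements are partitioned into
       `num_groups` magnitude groups (here via quantile thresholds,
       an illustrative choice).
    3. Scaling: each group is replaced by a single scaling factor
       (here the group's mean magnitude, also illustrative).
    """
    U, S, Vt = np.linalg.svd(task_delta, full_matrices=False)
    U, S, Vt = U[:, :rank], S[:rank], Vt[:rank, :]

    def threshold_and_scale(M: np.ndarray) -> np.ndarray:
        mags = np.abs(M)
        # Quantile edges partition the elements into magnitude groups.
        edges = np.quantile(mags, np.linspace(0.0, 1.0, num_groups + 1))
        out = np.zeros_like(M)
        for g in range(num_groups):
            mask = (mags >= edges[g]) & (mags <= edges[g + 1])
            scale = mags[mask].mean() if mask.any() else 0.0
            # One scalar per group; only the sign of each element is kept.
            out[mask] = np.sign(M[mask]) * scale
        return out

    return {"U": threshold_and_scale(U), "S": S,
            "Vt": threshold_and_scale(Vt)}

def dts_reconstruct(compressed: dict) -> np.ndarray:
    """Approximate the task-specific delta from its compressed form."""
    return compressed["U"] @ np.diag(compressed["S"]) @ compressed["Vt"]
```

Keeping rank r of an m-by-n delta stores roughly r(m+n) values plus a few scalars per group instead of mn, which is consistent in scale with the reported ~1% per-task storage overhead.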

Country of Origin
🇨🇳 China

Repos / Data Links
https://github.com/krumpguo/DTS

Page Count
18 pages

Category
Computer Science: Machine Learning (CS)