Weight Weaving: Parameter Pooling for Data-Free Model Merging
By: Levy Chaves, Eduardo Valle, Sandra Avila
Potential Business Impact:
Combines AI models without needing more data.
Model merging provides a cost-effective and data-efficient way to combine specialized deep neural networks through parameter integration. This technique leverages expert models across downstream tasks without requiring retraining. Most model merging approaches critically depend on scaling hyper-parameters $\lambda$, which weight each model's contribution globally or individually. Principled approaches for setting scaling factors without accessing any data (data-free) are scarce, often leading researchers to tune $\lambda$ using privileged data from the evaluation set, which is infeasible in practice. To address this limitation, we introduce Weight Weaving, a plug-and-play technique that pools model weights across the $\lambda$ search space using user-defined pooling functions, such as averaging, random selection, or even existing model merging methods. Our method is highly modular and imposes minimal constraints on the search space. It operates orthogonally to existing model merging methods and eliminates the need for evaluation data. We validate Weight Weaving across three ViT variants in three experimental setups: vision multi-task learning, vision continual learning, and domain generalization. Our method consistently improves the performance of several model merging methods, achieving average accuracy gains of up to 15.9 percentage points in a data-free setting.
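To make the pooling idea concrete, below is a minimal sketch of how weights merged at different $\lambda$ values could be pooled without any evaluation data. It assumes a task-arithmetic-style merge; the function names (`merge_at_lambda`, `weight_weaving`), the $\lambda$ grid, and the averaging pool are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of pooling merged checkpoints over a lambda grid.
# Names and the task-arithmetic merge rule are assumptions for illustration.
from typing import Callable
import torch


def merge_at_lambda(base: dict, task_vectors: list, lam: float) -> dict:
    """Merge a base checkpoint with summed task vectors scaled by one lambda."""
    merged = {}
    for name, weight in base.items():
        delta = sum(tv[name] for tv in task_vectors)
        merged[name] = weight + lam * delta
    return merged


def weight_weaving(base: dict, task_vectors: list,
                   lambda_grid: list, pool: Callable) -> dict:
    """Pool the merged checkpoints across the lambda search space (data-free)."""
    candidates = [merge_at_lambda(base, task_vectors, lam) for lam in lambda_grid]
    pooled = {}
    for name in base:
        stacked = torch.stack([c[name] for c in candidates])  # (num_lambdas, ...)
        pooled[name] = pool(stacked)  # user-defined pooling, e.g. averaging
    return pooled


# Example pooling function: uniform averaging over the lambda grid.
average_pool = lambda stacked: stacked.mean(dim=0)
```

In this sketch the pooling function is the only user choice; swapping `average_pool` for random selection, or for another merging method applied over the candidates, corresponds to the alternatives named in the abstract.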
Similar Papers
Dynamic Fisher-weighted Model Merging via Bayesian Optimization
Computation and Language
Combines AI models to do many jobs better.
Merge and Bound: Direct Manipulations on Weights for Class Incremental Learning
CV and Pattern Recognition
Teaches computers new things without forgetting old ones.
Merging Models on the Fly Without Retraining: A Sequential Approach to Scalable Continual Model Merging
Machine Learning (CS)
Combines AI skills without forgetting old ones.