Sparse Optimization for Transfer Learning: A L0-Regularized Framework for Multi-Source Domain Adaptation

Published: April 7, 2025 | arXiv ID: 2504.04812v1

By: Chenqi Gong, Hu Yang

Potential Business Impact:

Speeds up model training and improves prediction accuracy when a model must borrow knowledge from related but mismatched data sources.

Business Areas:
A/B Testing, Data and Analytics

This paper explores transfer learning in heterogeneous multi-source environments with distributional divergence between target and auxiliary domains. To address challenges in statistical bias and computational efficiency, we propose a Sparse Optimization for Transfer Learning (SOTL) framework based on L0 regularization. The method extends the Joint Estimation Transferred from Strata (JETS) paradigm with two key innovations: (1) L0-constrained exact sparsity for parameter space compression and complexity reduction, and (2) a refined optimization focus that emphasizes target parameters over redundant ones. Simulations show that SOTL significantly improves both estimation accuracy and computational speed, especially under adversarial auxiliary domain conditions. Empirical validation on the Community and Crime benchmarks demonstrates the statistical robustness of the SOTL method in cross-domain transfer.
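The abstract names the two ingredients only at a high level: exact L0 sparsity, and concentrating the fit on target parameters rather than redundant ones. Below is a minimal, hypothetical sketch of how those two ideas commonly combine in linear transfer learning: a pooled sparse estimate over target plus auxiliary data, followed by a sparse target-only correction. It assumes a linear model and enforces the L0 constraint with iterative hard thresholding; the function names (`iht_regression`, `sotl_sketch`) and the two-step pool-then-correct structure are illustrative assumptions, not the authors' actual SOTL algorithm.

```python
import numpy as np

def hard_threshold(v, s):
    # Exact L0 projection: keep the s largest-magnitude entries, zero the rest.
    out = np.zeros_like(v)
    if s > 0:
        keep = np.argsort(np.abs(v))[-s:]
        out[keep] = v[keep]
    return out

def iht_regression(X, y, s, n_iter=300):
    # L0-constrained least squares via iterative hard thresholding (IHT).
    n, p = X.shape
    beta = np.zeros(p)
    # Step size from the Lipschitz constant of the gradient of (1/2n)||y - Xb||^2.
    step = n / (np.linalg.norm(X, 2) ** 2)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n
        beta = hard_threshold(beta - step * grad, s)
    return beta

def sotl_sketch(X_tgt, y_tgt, aux_domains, s_shared, s_correction):
    # Step 1: pool target and auxiliary samples for a shared sparse estimate.
    X_pool = np.vstack([X_tgt] + [Xa for Xa, _ in aux_domains])
    y_pool = np.concatenate([y_tgt] + [ya for _, ya in aux_domains])
    w = iht_regression(X_pool, y_pool, s_shared)
    # Step 2: fit a sparse correction on target residuals only, so the final
    # optimization effort concentrates on target parameters.
    delta = iht_regression(X_tgt, y_tgt - X_tgt @ w, s_correction)
    return w + delta

# Toy usage: small target sample, larger auxiliary sample with shifted
# coefficients standing in for distributional divergence between domains.
rng = np.random.default_rng(0)
p = 100
beta_true = np.zeros(p)
beta_true[:5] = 1.0
X_t = rng.normal(size=(60, p))
y_t = X_t @ beta_true + 0.1 * rng.normal(size=60)
X_a = rng.normal(size=(600, p))
y_a = X_a @ (beta_true + 0.05) + 0.1 * rng.normal(size=600)
beta_hat = sotl_sketch(X_t, y_t, [(X_a, y_a)], s_shared=10, s_correction=10)
```

The hard-thresholding projection is what makes the sparsity exact rather than approximate: unlike an L1 penalty, it returns estimates with at most `s` nonzero coordinates by construction, which is the "parameter space compression" the abstract refers to.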

Country of Origin
🇨🇳 China

Page Count
19 pages

Category
Statistics: Machine Learning (stat.ML)