Multi-task Learning for Heterogeneous Multi-source Block-Wise Missing Data
By: Yang Sui, Qi Xu, Yang Bai, and more
Potential Business Impact:
Helps computers learn many things at once.
Multi-task learning (MTL) has emerged as an important machine learning tool for solving multiple learning tasks simultaneously and has been successfully applied in healthcare, marketing, and biomedical fields. However, to borrow information across different tasks effectively, it is essential to utilize both homogeneous and heterogeneous information. In the extensive literature on MTL, various forms of heterogeneity arise, such as block-wise, distributional, and posterior heterogeneity. Existing methods, however, struggle to address these forms of heterogeneity simultaneously in a unified framework. In this paper, we propose a two-step learning strategy for MTL that addresses the aforementioned heterogeneity. First, we impute the missing blocks using shared representations extracted from homogeneous sources across different tasks. Next, we disentangle the mappings between input features and responses into a shared component and a task-specific component, thereby enabling information borrowing through the shared component. Our numerical experiments and real-data analysis on the ADNI database demonstrate the superior MTL performance of the proposed method compared to competing methods.
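The two-step strategy in the abstract can be illustrated with a minimal sketch. This is not the authors' method, only a toy analogue under simplifying assumptions: two tasks share feature blocks A and B, one task is missing block B (block-wise missingness), the missing block is imputed via a shared linear A-to-B mapping learned from the complete task, and each task's regression coefficients are then decomposed into a shared part plus a ridge-penalized task-specific deviation.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Hypothetical setup: two tasks sharing feature blocks A and B.
# Task 1 observes both blocks; task 2 is missing block B entirely.
n, pA, pB = 200, 5, 3
W = rng.normal(size=(pA, pB))                       # assumed shared A -> B relation
XA1 = rng.normal(size=(n, pA))
XB1 = XA1 @ W + 0.1 * rng.normal(size=(n, pB))      # task 1: both blocks observed
XA2 = rng.normal(size=(n, pA))                      # task 2: block B unobserved

# Step 1: impute the missing block using the shared A -> B mapping
# estimated by least squares on the task where both blocks are observed.
W_hat, *_ = np.linalg.lstsq(XA1, XB1, rcond=None)
XB2_hat = XA2 @ W_hat                               # imputed block for task 2

# Simulated responses: shared coefficients beta plus small task deviations.
p = pA + pB
beta_true = rng.normal(size=p)
dev_true = [0.3 * rng.normal(size=p) for _ in range(2)]
X = [np.hstack([XA1, XB1]), np.hstack([XA2, XB2_hat])]
y = [X[t] @ (beta_true + dev_true[t]) + 0.1 * rng.normal(size=n) for t in range(2)]

# Step 2: alternating least squares on the convex objective
#   sum_t ||y_t - X_t (beta + d_t)||^2 + lam * sum_t ||d_t||^2,
# where the ridge penalty identifies the shared/task-specific split.
beta = np.zeros(p)
d = [np.zeros(p) for _ in range(2)]
lam = 5.0
for _ in range(20):
    # Update shared beta on residuals after removing task-specific parts.
    Xs = np.vstack(X)
    rs = np.concatenate([y[t] - X[t] @ d[t] for t in range(2)])
    beta, *_ = np.linalg.lstsq(Xs, rs, rcond=None)
    # Update each task-specific deviation by ridge regression.
    for t in range(2):
        A = X[t].T @ X[t] + lam * np.eye(p)
        d[t] = np.linalg.solve(A, X[t].T @ (y[t] - X[t] @ beta))

impute_mse = float(np.mean((XB2_hat - XA2 @ W) ** 2))
fit_mse = float(np.mean((X[1] @ (beta + d[1]) - y[1]) ** 2))
print(impute_mse, fit_mse)  # both should be small (near the 0.01 noise floor)
```

The ridge penalty on the task-specific deviations plays the role of the identifiability constraint: without it, the split between the shared component and the per-task components would be arbitrary.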
Similar Papers
Multi-task Learning for Heterogeneous Data via Integrating Shared and Task-Specific Encodings
Machine Learning (Stat)
Helps doctors predict cancer growth better.
Tensorized Multi-Task Learning for Personalized Modeling of Heterogeneous Individuals with High-Dimensional Data
Machine Learning (CS)
Helps computers learn about different groups of people.
Robust-Multi-Task Gradient Boosting
Machine Learning (CS)
Helps computers learn from many tasks, even bad ones.