Score: 1

An Information-theoretic Multi-task Representation Learning Framework for Natural Language Understanding

Published: March 6, 2025 | arXiv ID: 2503.04667v1

By: Dou Hu, Lingwei Wei, Wei Zhou, and more

Potential Business Impact:

Helps computers understand language more accurately and robustly, even when the input text is noisy or contains errors.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

This paper proposes a new principled multi-task representation learning framework (InfoMTL) that extracts noise-invariant, sufficient representations for all tasks. It ensures the sufficiency of shared representations across tasks and mitigates the negative effect of redundant features, which can enhance the language understanding of pre-trained language models (PLMs) under the multi-task paradigm. First, a shared information maximization principle is proposed to learn more sufficient shared representations for all target tasks, avoiding the insufficiency that arises from representation compression in the multi-task paradigm. Second, a task-specific information minimization principle is designed to mitigate the negative effect of potentially redundant features in the input for each task; it compresses task-irrelevant redundant information while preserving the information necessary for multi-task prediction. Experiments on six classification benchmarks show that the method outperforms 12 competing multi-task methods under the same multi-task settings, especially in data-constrained and noisy scenarios. Extensive experiments demonstrate that the learned representations are more sufficient, data-efficient, and robust.
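To make the two principles concrete, below is a minimal, illustrative sketch of how such an objective is commonly assembled: a contrastive (InfoNCE-style) term acts as a lower bound on shared mutual information across two views of the input, while a variational KL term compresses each task-specific representation. All names, the encoder interface, and the loss weighting here are assumptions for illustration, not the authors' released implementation.

```python
# Illustrative sketch (assumed design, not the paper's code): a multi-task
# objective combining shared-information maximization (contrastive term)
# with task-specific information minimization (variational KL compression).
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiTaskIBModel(nn.Module):
    def __init__(self, encoder, hidden_dim, num_labels_per_task):
        super().__init__()
        # `encoder` is assumed to map (input_ids, attention_mask) to a pooled
        # sentence embedding of size hidden_dim (e.g., a PLM backbone + pooler).
        self.encoder = encoder
        # Per-task heads produce mean and log-variance for a stochastic
        # task-specific representation, plus a linear classifier.
        self.mu_heads = nn.ModuleList(
            [nn.Linear(hidden_dim, hidden_dim) for _ in num_labels_per_task])
        self.logvar_heads = nn.ModuleList(
            [nn.Linear(hidden_dim, hidden_dim) for _ in num_labels_per_task])
        self.classifiers = nn.ModuleList(
            [nn.Linear(hidden_dim, n) for n in num_labels_per_task])

    def forward(self, input_ids, attention_mask, task_id):
        shared = self.encoder(input_ids, attention_mask)       # shared representation
        mu = self.mu_heads[task_id](shared)
        logvar = self.logvar_heads[task_id](shared)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        logits = self.classifiers[task_id](z)
        return shared, mu, logvar, logits


def info_nce(view_a, view_b, temperature=0.1):
    """Contrastive lower bound on mutual information between two views."""
    view_a = F.normalize(view_a, dim=-1)
    view_b = F.normalize(view_b, dim=-1)
    logits = view_a @ view_b.t() / temperature
    targets = torch.arange(view_a.size(0), device=view_a.device)
    return F.cross_entropy(logits, targets)


def multitask_loss(shared_a, shared_b, mu, logvar, logits, labels, beta=1e-3):
    # Maximize information shared across two views of the same inputs,
    # compress the task-specific representation, and fit the task labels.
    l_shared = info_nce(shared_a, shared_b)
    l_compress = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
    l_task = F.cross_entropy(logits, labels)
    return l_task + l_shared + beta * l_compress
```

In use, `shared_a` and `shared_b` would come from two forward passes over augmented (or dropout-perturbed) versions of the same batch; the `beta` weight trades off compression against task accuracy, analogous to an information-bottleneck coefficient.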

Repos / Data Links

Page Count
11 pages

Category
Computer Science: Computation and Language