Simplifying Multi-Task Architectures Through Task-Specific Normalization

Published: December 23, 2025 | arXiv ID: 2512.20420v1

By: Mihai Suteu, Ovidiu Serban

Potential Business Impact:

Lets a single AI model learn many tasks at once more efficiently, using simpler architectures and fewer task-specific parameters.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Multi-task learning (MTL) aims to leverage shared knowledge across tasks to improve generalization and parameter efficiency, yet balancing resources and mitigating interference remain open challenges. Architectural solutions often introduce elaborate task-specific modules or routing schemes, increasing complexity and overhead. In this work, we show that normalization layers alone are sufficient to address many of these challenges. Simply replacing shared normalization with task-specific variants already yields competitive performance, questioning the need for complex designs. Building on this insight, we propose Task-Specific Sigmoid Batch Normalization (TSσBN), a lightweight mechanism that enables tasks to softly allocate network capacity while fully sharing feature extractors. TSσBN improves stability across CNNs and Transformers, matching or exceeding performance on NYUv2, Cityscapes, CelebA, and PASCAL-Context, while remaining highly parameter-efficient. Moreover, its learned gates provide a natural framework for analyzing MTL dynamics, offering interpretable insights into capacity allocation, filter specialization, and task relationships. Our findings suggest that complex MTL architectures may be unnecessary and that task-specific normalization offers a simple, interpretable, and efficient alternative.
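
The following is a minimal, illustrative sketch of the idea described in the abstract: a fully shared feature extractor combined with per-task batch normalization and a sigmoid gate per channel, so each task softly claims a share of the shared capacity. The module name `TaskSigmoidBN` and the exact gate parameterization are assumptions for illustration, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class TaskSigmoidBN(nn.Module):
    """Illustrative task-specific, sigmoid-gated batch normalization.

    Each task keeps its own BatchNorm statistics/affine parameters plus a
    learnable per-channel gate squashed through a sigmoid, so tasks softly
    allocate channels of a fully shared feature extractor.
    """
    def __init__(self, num_channels: int, num_tasks: int):
        super().__init__()
        # One BatchNorm per task: task-specific statistics and affine terms.
        self.bns = nn.ModuleList(
            nn.BatchNorm2d(num_channels) for _ in range(num_tasks)
        )
        # One gate logit per (task, channel); sigmoid keeps gates in (0, 1).
        self.gate_logits = nn.Parameter(torch.zeros(num_tasks, num_channels))

    def forward(self, x: torch.Tensor, task_id: int) -> torch.Tensor:
        x = self.bns[task_id](x)
        gates = torch.sigmoid(self.gate_logits[task_id])
        return x * gates.view(1, -1, 1, 1)


# Usage: the convolution is shared; only normalization and gating are per-task.
shared_conv = nn.Conv2d(3, 64, kernel_size=3, padding=1)
ts_bn = TaskSigmoidBN(num_channels=64, num_tasks=2)

images = torch.randn(4, 3, 32, 32)
feats_task0 = ts_bn(shared_conv(images), task_id=0)
feats_task1 = ts_bn(shared_conv(images), task_id=1)
```

Inspecting the learned gates (the sigmoid of `gate_logits`) per task is what, in the paper's framing, yields interpretable signals about capacity allocation, filter specialization, and task relationships.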

Country of Origin
🇬🇧 United Kingdom

Page Count
22 pages

Category
Computer Science:
Machine Learning (CS)