RobuMTL: Enhancing Multi-Task Learning Robustness Against Weather Conditions
By: Tasneem Shaffee, Sherief Reda
Potential Business Impact:
Helps self-driving cars see in bad weather.
Robust Multi-Task Learning (MTL) is crucial for autonomous systems operating in real-world environments, where adverse weather conditions can severely degrade model performance and reliability. In this paper, we introduce RobuMTL, a novel architecture designed to adaptively address visual degradation by dynamically selecting task-specific hierarchical Low-Rank Adaptation (LoRA) modules and a LoRA expert squad based on input perturbations, in a mixture-of-experts fashion. Our framework enables adaptive specialization based on input characteristics, improving robustness across diverse real-world conditions. To validate our approach, we evaluated it on the PASCAL and NYUD-v2 datasets and compared it against single-task models, standard MTL baselines, and state-of-the-art methods. On the PASCAL benchmark, RobuMTL delivers a +2.8% average relative improvement under single perturbations and up to +44.4% under mixed weather conditions compared to the MTL baseline. On NYUD-v2, RobuMTL achieves a +9.7% average relative improvement across tasks. The code is available on GitHub.
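The abstract does not spell out the implementation, but the core idea it describes (a gate that routes each input to perturbation-specific LoRA experts layered on a frozen backbone) can be sketched as follows. This is a minimal illustrative sketch, not the authors' code: the class names (`LoRALinear`, `LoRAExpertSquad`, `PerturbationGate`), ranks, and expert counts are assumptions chosen for illustration.

```python
# Minimal sketch (not the authors' implementation) of mixing LoRA experts
# over a frozen layer, with a gate that weights experts per input.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LoRALinear(nn.Module):
    """A frozen linear layer plus a trainable low-rank (LoRA) update."""
    def __init__(self, base: nn.Linear, rank: int = 4, alpha: float = 8.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False          # backbone weights stay frozen
        self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x):
        # base(x) + scale * B @ (A @ x)
        return self.base(x) + self.scale * F.linear(F.linear(x, self.A), self.B)

class LoRAExpertSquad(nn.Module):
    """Several LoRA experts sharing one frozen layer, mixed by gate weights."""
    def __init__(self, base: nn.Linear, num_experts: int = 4, rank: int = 4):
        super().__init__()
        self.experts = nn.ModuleList(
            [LoRALinear(base, rank) for _ in range(num_experts)]
        )

    def forward(self, x, gate_weights):
        # gate_weights: (batch, num_experts) soft assignment per input
        outs = torch.stack([e(x) for e in self.experts], dim=1)  # (B, E, D)
        return (gate_weights.unsqueeze(-1) * outs).sum(dim=1)

class PerturbationGate(nn.Module):
    """Predicts per-input expert weights from a feature embedding."""
    def __init__(self, feat_dim: int, num_experts: int = 4):
        super().__init__()
        self.fc = nn.Linear(feat_dim, num_experts)

    def forward(self, feats):
        return F.softmax(self.fc(feats), dim=-1)

# Usage sketch: route a batch of backbone features through the expert squad.
base_layer = nn.Linear(256, 256)
squad = LoRAExpertSquad(base_layer, num_experts=4, rank=4)
gate = PerturbationGate(feat_dim=256, num_experts=4)

feats = torch.randn(8, 256)      # stand-in for backbone features
weights = gate(feats)            # (8, 4) soft expert weights per input
out = squad(feats, weights)      # (8, 256) perturbation-adapted features
```

In this sketch only the LoRA parameters and the gate are trainable, which matches the general appeal of LoRA-based adaptation (small task- or condition-specific modules on top of a shared frozen backbone); how RobuMTL organizes its hierarchical, task-specific modules is detailed in the paper itself.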
Similar Papers
VLM-Augmented Degradation Modeling for Image Restoration Under Adverse Weather Conditions
CV and Pattern Recognition
Clears blurry pictures from rain, fog, and snow.
Robust-Multi-Task Gradient Boosting
Machine Learning (CS)
Helps computers learn from many tasks, even bad ones.
From Snow to Rain: Evaluating Robustness, Calibration, and Complexity of Model-Based Robust Training
CV and Pattern Recognition
Makes AI see signs in snow and rain.