Structured Output Regularization: a framework for few-shot transfer learning

Published: October 9, 2025 | arXiv ID: 2510.08728v1

By: Nicolas Ewen, Jairo Diaz-Rodriguez, Kelly Ramsay

Potential Business Impact:

Helps computers learn from fewer medical images.

Business Areas:
Image Recognition, Data and Analytics, Software

Traditional transfer learning typically reuses large pre-trained networks by freezing some of their weights and adding task-specific layers. While this approach is computationally efficient, it limits the model's ability to adapt to domain-specific features and can still lead to overfitting with very limited data. To address these limitations, we propose Structured Output Regularization (SOR), a simple yet effective framework that freezes the internal network structures (e.g., convolutional filters) while using a combination of group lasso and $L_1$ penalties. This framework tailors the model to specific data with minimal additional parameters and is easily applicable to various network components, such as convolutional filters or various blocks in neural networks, enabling broad applicability for transfer learning tasks. We evaluate SOR on three few-shot medical imaging classification tasks and achieve competitive results using DenseNet121 and EfficientNetB4 bases compared to established benchmarks.
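The core idea in the abstract can be sketched numerically: keep the pre-trained filters frozen, attach a small set of trainable output weights, and penalize those weights with a group lasso term (one group per filter, so whole filters can be switched off) plus an elementwise $L_1$ term. This is a minimal NumPy sketch under our own assumptions; the names (`sor_penalty`, `scales`) and the per-filter grouping are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the outputs of frozen convolutional filters:
# 4 filters, each producing an 8-dimensional (flattened) feature map.
frozen_features = rng.normal(size=(4, 8))

# Trainable per-filter output weights: the only new parameters.
# Grouping rows by filter lets group lasso zero out entire filters.
scales = rng.normal(size=(4, 8))

def sor_penalty(w, lam_group=0.1, lam_l1=0.01):
    """Group lasso (L2 norm per filter group) plus elementwise L1."""
    group_term = np.sum(np.linalg.norm(w, axis=1))  # one group per filter row
    l1_term = np.sum(np.abs(w))
    return lam_group * group_term + lam_l1 * l1_term

# Task-adapted features: frozen internal structure, re-weighted outputs.
adapted = scales * frozen_features
penalty = sor_penalty(scales)
```

In training, `sor_penalty(scales)` would be added to the task loss so that gradient descent sparsifies the output weights, pruning filters that do not help the few-shot target task.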

Country of Origin
🇨🇦 Canada

Page Count
20 pages

Category
Computer Science:
CV and Pattern Recognition