Domain-Generalization to Improve Learning in Meta-Learning Algorithms

Published: August 13, 2025 | arXiv ID: 2508.09418v1

By: Usman Anjum, Chris Stockman, Cat Luong, and others

Potential Business Impact:

Teaches computers to learn new tasks quickly from only a few examples.

This paper introduces Domain Generalization Sharpness-Aware Minimization Model-Agnostic Meta-Learning (DGS-MAML), a novel meta-learning algorithm designed to generalize across tasks with limited training data. DGS-MAML combines gradient matching with sharpness-aware minimization in a bi-level optimization framework to enhance model adaptability and robustness. The method is supported by theoretical analysis based on PAC-Bayes bounds and convergence guarantees. Experimental results on benchmark datasets show that DGS-MAML outperforms existing approaches in accuracy and generalization. The proposed method is particularly useful for scenarios requiring few-shot learning and quick adaptation, and the source code is publicly available on GitHub.
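To make the combination of MAML-style bi-level optimization and sharpness-aware minimization concrete, here is a minimal toy sketch, not the paper's actual algorithm. It uses 1-D linear regression tasks (y = a·x) so all gradients are analytic, and adds a SAM-style step that evaluates the meta-gradient at a perturbed point before updating. All names, hyperparameters (`alpha`, `rho`, `lr`), and the scalar-parameter simplification are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def task_grad(w, x, a):
    # Gradient of the task loss mean((w*x - a*x)^2) w.r.t. w:
    # d/dw = 2*mean(x^2)*(w - a)
    return 2.0 * np.mean(x**2) * (w - a)

def meta_grad(w, tasks, alpha=0.1):
    # MAML outer gradient: one inner gradient step per task on the
    # support set, then the query-set gradient chained through that
    # step (exact here because the inner update is linear in w).
    g = 0.0
    for a, x_support, x_query in tasks:
        c_s = 2.0 * np.mean(x_support**2)
        w_adapted = w - alpha * task_grad(w, x_support, a)
        g += task_grad(w_adapted, x_query, a) * (1.0 - alpha * c_s)
    return g / len(tasks)

def sam_maml_step(w, tasks, lr=0.05, rho=0.01, alpha=0.1):
    # SAM-style update: perturb w in the ascent direction of the
    # meta-loss, then descend using the gradient at the perturbed
    # point, favoring flat minima of the meta-objective.
    g = meta_grad(w, tasks, alpha)
    eps = rho * np.sign(g)  # scalar case: g/|g| reduces to sign(g)
    g_sharp = meta_grad(w + eps, tasks, alpha)
    return w - lr * g_sharp

# Each task: true slope a, support inputs, query inputs (5-shot).
tasks = [(a, rng.uniform(-1, 1, 5), rng.uniform(-1, 1, 5))
         for a in (0.5, 1.0, 1.5)]

w = 0.0  # meta-parameter initialization
for _ in range(200):
    w = sam_maml_step(w, tasks)
print(w)
```

After meta-training, `w` sits between the task slopes, so a single inner gradient step adapts it quickly to any one task; this is the few-shot-adaptation behavior the abstract describes, with the SAM perturbation biasing the search toward flatter, more robust solutions.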

Country of Origin
🇺🇸 United States

Page Count
27 pages

Category
Computer Science:
Machine Learning (CS)