Score: 4

Scalable Evaluation and Neural Models for Compositional Generalization

Published: November 4, 2025 | arXiv ID: 2511.02667v1

By: Giacomo Camposampiero, Pietro Barbiero, Michael Hersche, and more

BigTech Affiliations: IBM

Potential Business Impact:

Teaches computers to recognize new combinations of concepts they already know.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Compositional generalization, a key open challenge in modern machine learning, requires models to predict unknown combinations of known concepts. However, assessing compositional generalization remains a fundamental challenge due to the lack of standardized evaluation protocols and the limitations of current benchmarks, which often favor efficiency over rigor. At the same time, general-purpose vision architectures lack the necessary inductive biases, and existing approaches to endow them with such biases compromise scalability. As a remedy, this paper introduces: 1) a rigorous evaluation framework that unifies and extends previous approaches while reducing computational requirements from combinatorial to constant; 2) an extensive and modern evaluation of the status of compositional generalization in supervised vision backbones, training more than 5000 models; 3) Attribute Invariant Networks, a class of models establishing a new Pareto frontier in compositional generalization, achieving a 23.43% accuracy improvement over baselines while reducing parameter overhead from 600% to 16% compared to fully disentangled counterparts.
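To make the core idea concrete, here is a minimal toy sketch (not the paper's evaluation framework) of what a compositional-generalization split looks like: the test set holds out *combinations* of attribute values, while every individual value still appears during training. The attribute names and values are illustrative assumptions.

```python
from itertools import product

# Toy attributes (illustrative only, not from the paper).
shapes = ["circle", "square", "triangle"]
colors = ["red", "green", "blue"]

all_combos = list(product(shapes, colors))

# Hold out the "diagonal" combinations: each shape and each color
# is still seen in training, just never together in these pairs.
test = [(s, c) for i, s in enumerate(shapes)
               for j, c in enumerate(colors) if i == j]
train = [pair for pair in all_combos if pair not in test]

# Every individual attribute value appears in train,
# yet no held-out combination does.
assert {s for s, _ in train} == set(shapes)
assert {c for _, c in train} == set(colors)
assert not set(train) & set(test)
```

A model that succeeds on such a split must recombine concepts it has seen separately; naively, evaluating all possible held-out splits grows combinatorially with the number of attributes, which is the cost the paper's framework reduces to constant.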

Country of Origin
πŸ‡¨πŸ‡­ πŸ‡ΊπŸ‡Έ Switzerland, United States

Repos / Data Links

Page Count
62 pages

Category
Computer Science:
Machine Learning (CS)