Score: 1

Memorizing Long-tail Data Can Help Generalization Through Composition

Published: October 18, 2025 | arXiv ID: 2510.16322v1

By: Mo Zhou, Haoyang Ma, Rong Ge

BigTech Affiliations: University of Washington

Potential Business Impact:

Helps models handle rare inputs by memorizing long-tail examples and combining them.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

Deep learning has led researchers to rethink the relationship between memorization and generalization. In many settings, memorization does not hurt generalization due to implicit regularization, and it may even help by capturing long-tailed examples. In this paper, we consider the synergy between memorization and simple composition -- the ability to make correct predictions on a combination of long-tailed features. Theoretically, we show that in a linear setting, memorization together with composition can help the model make correct predictions on rare test examples that require a combination of long-tailed features, even if such combinations were never observed in the training data. Experiments with neural network architectures on simple data show that the theoretical insight extends beyond the linear setting, and we further observe that the model's composition capability depends on its architecture.
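The abstract's linear-setting claim can be illustrated with a minimal sketch. This is a hypothetical toy construction, not the paper's exact setup: each long-tailed feature appears in training only once, on its own (so fitting it amounts to memorizing that single example), yet because a linear model's prediction is additive in its weights, it composes correctly on a test input that activates two rare features never seen together.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10
true_w = rng.normal(size=d)  # hypothetical ground-truth linear target

# Training data: "head" features 0-4 recur often; each long-tail
# feature 5-9 appears exactly once, alone (a memorized example).
rows = []
for _ in range(50):
    x = np.zeros(d)
    x[rng.integers(0, 5)] = 1.0
    rows.append(x)
for rare in range(5, 10):
    x = np.zeros(d)
    x[rare] = 1.0
    rows.append(x)
X_train = np.array(rows)
y_train = X_train @ true_w  # noiseless labels for illustration

# Fit by least squares (minimum-norm solution).
w_hat, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

# Test: a combination of two rare features never observed together.
x_test = np.zeros(d)
x_test[5] = 1.0
x_test[7] = 1.0
pred = x_test @ w_hat
target = x_test @ true_w
print(abs(pred - target))  # linearity composes the two memorized weights
```

Because predictions are linear, the weight memorized for feature 5 and the weight memorized for feature 7 simply add, so the unseen combination is handled correctly; the paper's contribution is showing when and why analogous behavior survives beyond this idealized linear case.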

Country of Origin
🇺🇸 United States

Page Count
30 pages

Category
Computer Science:
Machine Learning (CS)