Extrapolation by Association: Length Generalization Transfer in Transformers
By: Ziyang Cai, Nayoung Lee, Avi Schwarzschild, and more
Potential Business Impact:
Helps computers learn longer tasks from similar ones.
Transformer language models have demonstrated impressive generalization capabilities in natural language domains, yet we lack a fine-grained understanding of how such generalization arises. In this paper, we investigate length generalization--the ability to extrapolate from shorter to longer inputs--through the lens of task association. We find that length generalization can be transferred across related tasks. That is, training a model with a longer and related auxiliary task can lead it to generalize to unseen and longer inputs from some other target task. We demonstrate this length generalization transfer across diverse algorithmic tasks, including arithmetic operations, string transformations, and maze navigation. Our results show that transformer models can inherit generalization capabilities from similar tasks when trained jointly. Moreover, we observe similar transfer effects in pretrained language models, suggesting that pretraining equips models with reusable computational scaffolding that facilitates extrapolation in downstream settings. Finally, we provide initial mechanistic evidence that length generalization transfer correlates with the re-use of the same attention heads between the tasks. Together, our findings deepen our understanding of how transformers generalize to out-of-distribution inputs and highlight the compositional reuse of inductive structure across tasks.
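The joint-training setup the abstract describes (a target task seen only at short lengths, trained together with a related auxiliary task seen at longer lengths) can be made concrete with a small data-construction sketch. The specific task pair (addition as the target, string reversal as the auxiliary), the length cutoffs, and the prompt formats below are illustrative assumptions, not the paper's exact configuration; the paper's experiments span arithmetic, string transformations, and maze navigation.

```python
# Minimal sketch (not the authors' code) of a length-generalization-transfer
# training mix: target-task examples up to a short length, auxiliary-task
# examples up to a longer length, shuffled together for joint training.
import random


def addition_example(n_digits: int) -> str:
    """Target task: add two n-digit numbers, formatted as prompt=answer."""
    a = random.randint(10 ** (n_digits - 1), 10 ** n_digits - 1)
    b = random.randint(10 ** (n_digits - 1), 10 ** n_digits - 1)
    return f"ADD {a}+{b}={a + b}"


def reverse_example(n_chars: int) -> str:
    """Auxiliary task: reverse a random digit string of length n_chars."""
    s = "".join(random.choice("0123456789") for _ in range(n_chars))
    return f"REV {s}={s[::-1]}"


def build_training_mix(n_per_task: int = 10_000,
                       target_max_len: int = 10,
                       aux_max_len: int = 20) -> list[str]:
    """Joint training data: short target-task plus longer auxiliary-task examples."""
    data = [addition_example(random.randint(1, target_max_len))
            for _ in range(n_per_task)]
    data += [reverse_example(random.randint(1, aux_max_len))
             for _ in range(n_per_task)]
    random.shuffle(data)
    return data


# A decoder-only transformer would be trained on this mix; transfer is then
# measured by probing the *target* task at lengths it never saw (11-20 digits).
train_data = build_training_mix()
eval_prompts = [addition_example(n) for n in range(11, 21)]
```

Under this setup, transfer would show up as the jointly trained model solving the 11-20 digit addition probes, while a model trained on the short addition data alone fails at those lengths.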
Similar Papers
Transformers Provably Learn Chain-of-Thought Reasoning with Length Generalization
Machine Learning (CS)
AI learns to solve harder problems with longer thinking.
Quantitative Bounds for Length Generalization in Transformers
Machine Learning (CS)
Makes AI understand longer text by training it more.
On Vanishing Variance in Transformer Length Generalization
Machine Learning (CS)
Makes AI better at remembering longer stories.