Score: 1

Provable In-Context Vector Arithmetic via Retrieving Task Concepts

Published: August 13, 2025 | arXiv ID: 2508.09820v1

By: Dake Bu, Wei Huang, Andi Han, and more

Potential Business Impact:

Helps AI models recall facts by learning a task from the examples given in the prompt

In-context learning (ICL) has garnered significant attention for its ability to grasp functions/tasks from demonstrations. Recent studies suggest the presence of a latent task/function vector in LLMs during ICL. Merullo et al. (2024) showed that LLMs leverage this vector alongside the residual stream for Word2Vec-like vector arithmetic, solving factual-recall ICL tasks. Additionally, recent work empirically highlighted the key role of Question-Answer data in enhancing factual-recall capabilities. Despite these insights, a theoretical explanation remains elusive. To move one step forward, we propose a theoretical framework built on empirically grounded hierarchical concept modeling. We develop an optimization theory showing how nonlinear residual transformers trained via gradient descent on cross-entropy loss perform factual-recall ICL tasks via vector arithmetic. We prove 0-1 loss convergence and show strong generalization, including robustness to concept recombination and distribution shifts. These results elucidate the advantages of transformers over static embedding predecessors. Empirical simulations corroborate our theoretical insights.
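To make the "Word2Vec-like vector arithmetic" concrete, here is a minimal toy sketch (not the paper's construction): a shared "capital-of" relation is baked into synthetic embeddings, a task vector is estimated from two in-context demonstrations, and the answer to a new query is decoded by adding that vector to the query's embedding and taking the nearest vocabulary embedding. All embeddings, names, and the decoding rule are illustrative assumptions, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 32

# Toy construction: each capital's embedding equals its country's embedding
# plus a shared "capital-of" relation vector (plus small noise). These are
# hypothetical stand-ins for an LLM's (un)embedding rows.
relation = rng.normal(size=dim)
countries = ["Poland", "France", "Japan"]
capitals = ["Warsaw", "Paris", "Tokyo"]
E = {c: rng.normal(size=dim) for c in countries}
for country, capital in zip(countries, capitals):
    E[capital] = E[country] + relation + 0.05 * rng.normal(size=dim)

# Estimate the latent task vector from two in-context demonstration pairs,
# mimicking the idea that ICL extracts a function vector from examples.
task_vector = np.mean([E["Warsaw"] - E["Poland"],
                       E["Paris"] - E["France"]], axis=0)

def recall(query: str) -> str:
    """Decode by adding the task vector (a residual-stream-style update)
    and returning the vocabulary item with highest cosine similarity."""
    h = E[query] + task_vector
    def cos(a, b):
        return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
    candidates = {w: v for w, v in E.items() if w != query}
    return max(candidates, key=lambda w: cos(h, candidates[w]))

print(recall("Japan"))  # -> "Tokyo" under this toy construction
```

The held-out pair ("Japan", "Tokyo") never appears in the demonstrations, so decoding it correctly is a toy analogue of the generalization to unseen concept combinations that the paper analyzes.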

Country of Origin
🇭🇰 Hong Kong

Repos / Data Links

Page Count
56 pages

Category
Computer Science:
Machine Learning (CS)