Provable In-Context Vector Arithmetic via Retrieving Task Concepts
By: Dake Bu, Wei Huang, Andi Han, and more
Potential Business Impact:
Helps computers recall facts from just a few examples.
In-context learning (ICL) has garnered significant attention for its ability to grasp functions/tasks from demonstrations. Recent studies suggest the presence of a latent task/function vector in LLMs during ICL. Merullo et al. (2024) showed that LLMs leverage this vector, together with the residual stream, to perform Word2Vec-like vector arithmetic when solving factual-recall ICL tasks. Additionally, recent work has empirically highlighted the key role of Question-Answer data in enhancing factual-recall capabilities. Despite these insights, a theoretical explanation remains elusive. To move one step forward, we propose a theoretical framework built on empirically grounded hierarchical concept modeling. We develop an optimization theory showing how nonlinear residual transformers trained via gradient descent on cross-entropy loss perform factual-recall ICL tasks via vector arithmetic. We prove convergence of the 0-1 loss and establish strong generalization, including robustness to concept recombination and distribution shifts. These results elucidate the advantages of transformers over their static-embedding predecessors. Empirical simulations corroborate our theoretical insights.
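The "vector arithmetic" view can be illustrated with a minimal sketch (this is not the paper's construction; the toy embeddings, vocabulary, and helper names below are assumptions for illustration only): a task vector is estimated from the in-context demonstrations as the average query-to-answer offset, added to the query's residual-stream representation, and the answer is read out by nearest-neighbor retrieval, in the spirit of Word2Vec-style analogies.

```python
import numpy as np

# Toy illustration (assumed setup, not the paper's model): a factual-recall
# ICL task solved by Word2Vec-like vector arithmetic with a latent task vector.

rng = np.random.default_rng(0)
dim = 16
countries = {"France": "Paris", "Japan": "Tokyo", "Poland": "Warsaw"}

# Random query embeddings; each answer embedding is the query shifted by a
# shared "capital-of" concept vector theta (plus small noise).
emb = {c: rng.normal(size=dim) for c in countries}
theta = rng.normal(size=dim)
for country, capital in countries.items():
    emb[capital] = emb[country] + theta + 0.05 * rng.normal(size=dim)

def estimate_task_vector(demos):
    """Average the (answer - query) offsets over the in-context demonstrations."""
    return np.mean([emb[a] - emb[q] for q, a in demos], axis=0)

def predict(query, task_vec, candidates):
    """Add the task vector to the query state and retrieve the nearest answer."""
    target = emb[query] + task_vec
    return min(candidates, key=lambda w: np.linalg.norm(emb[w] - target))

demos = [("France", "Paris"), ("Japan", "Tokyo")]      # in-context examples
task_vec = estimate_task_vector(demos)
print(predict("Poland", task_vec, ["Paris", "Tokyo", "Warsaw"]))  # -> Warsaw
```

Running the snippet prints "Warsaw": the "capital-of" offset estimated from two demonstrations transfers to a query pair not seen in context, which is the kind of retrieval-by-arithmetic behavior the paper analyzes theoretically.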
Similar Papers
Understanding Task Vectors in In-Context Learning: Emergence, Functionality, and Limitations
Machine Learning (CS)
Makes AI learn new things faster and better.
Adaptive Task Vectors for Large Language Models
Machine Learning (CS)
Helps computers learn new things faster and better.
A Simple Generalisation of the Implicit Dynamics of In-Context Learning
Machine Learning (CS)
Teaches computers to learn from examples without changing the model itself.