How does Transformer Learn Implicit Reasoning?
By: Jiaran Ye, Zijun Yao, Zhidian Huang, and more
Potential Business Impact:
Teaches computers to think step-by-step.
Recent work suggests that large language models (LLMs) can perform multi-hop reasoning implicitly -- producing correct answers without explicitly verbalizing intermediate steps -- but the underlying mechanisms remain poorly understood. In this paper, we study how such implicit reasoning emerges by training transformers from scratch in a controlled symbolic environment. Our analysis reveals a three-stage developmental trajectory: early memorization, followed by in-distribution generalization, and eventually cross-distribution generalization. We find that training with atomic triples is not necessary but accelerates learning, and that second-hop generalization relies on query-level exposure to specific compositional structures. To interpret these behaviors, we introduce two diagnostic tools: cross-query semantic patching, which identifies semantically reusable intermediate representations, and a cosine-based representational lens, which reveals that successful reasoning correlates with cosine-based clustering of hidden states. This clustering phenomenon in turn provides a coherent explanation for the behavioral dynamics observed across training, linking representational structure to reasoning capability. These findings offer new insight into the interpretability of implicit multi-hop reasoning in LLMs, clarifying how complex reasoning unfolds internally and suggesting pathways toward more transparent models.
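To make the second diagnostic concrete, here is a minimal sketch of a cosine-based representational lens. Everything in it is illustrative: the names (hidden_states, bridge_ids, clustering_score) are hypothetical, and the vectors are randomly generated stand-ins for hidden states that, in the paper's setting, would be read out of a transformer trained on symbolic two-hop queries at a fixed layer and token position. The score it computes (mean intra-cluster minus mean inter-cluster cosine similarity, grouping queries by their shared intermediate entity) is one plausible way to quantify the clustering the abstract describes, not necessarily the authors' exact metric.

```python
# Minimal sketch of a cosine-based representational lens (illustrative only).
# hidden_states and bridge_ids are hypothetical stand-ins: in practice they
# would be one hidden vector per two-hop query, taken at a fixed layer and
# token position, labeled by the query's unverbalized bridge entity.
import numpy as np

def cosine_matrix(vectors: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarity between row vectors."""
    normed = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    return normed @ normed.T

def clustering_score(hidden_states: np.ndarray, bridge_ids: np.ndarray) -> float:
    """Mean intra-cluster cosine minus mean inter-cluster cosine.

    A positive score means queries sharing the same bridge entity sit
    closer together in hidden space -- the clustering signature the
    representational lens looks for.
    """
    sims = cosine_matrix(hidden_states)
    same = bridge_ids[:, None] == bridge_ids[None, :]
    off_diag = ~np.eye(len(bridge_ids), dtype=bool)
    intra = sims[same & off_diag].mean()  # same bridge entity, distinct queries
    inter = sims[~same].mean()            # different bridge entities
    return float(intra - inter)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fabricated data: 60 queries, 3 bridge entities, 64-dim hidden states.
    # Each cluster is a shared direction plus noise, mimicking a model that
    # has learned reusable intermediate representations.
    centers = rng.normal(size=(3, 64))
    bridge_ids = np.repeat(np.arange(3), 20)
    hidden_states = centers[bridge_ids] + 0.5 * rng.normal(size=(60, 64))
    print(f"clustering score: {clustering_score(hidden_states, bridge_ids):.3f}")
```

Under this reading of the lens, a model that has formed reusable intermediate representations should yield a clearly positive score at the layer where the bridge entity is resolved, while a score near zero would indicate no such clustering.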
Similar Papers
Language models can learn implicit multi-hop reasoning, but only if they have lots of training data
Computation and Language
Computers learn to solve hard problems faster.
Implicit Reasoning in Transformers is Reasoning through Shortcuts
Computation and Language
Teaches computers to solve problems by copying patterns.
A Survey on Latent Reasoning
Computation and Language
Lets computers think faster without words.