The Molecular Structure of Thought: Mapping the Topology of Long Chain-of-Thought Reasoning
By: Qiguang Chen, Yantao Du, Ziniu Li, and more
Potential Business Impact:
Teaches computers to reason step by step more effectively.
Large language models (LLMs) often fail to learn effective long chain-of-thought (Long CoT) reasoning by imitating humans or non-Long-CoT LLMs. To understand why, we propose a unified view in which effective and learnable Long CoT trajectories exhibit stable, molecule-like structures formed by three interaction types: Deep-Reasoning (covalent-like), Self-Reflection (hydrogen-bond-like), and Self-Exploration (van der Waals-like). Analysis of distilled trajectories reveals that these structures emerge from Long CoT fine-tuning, not from keyword imitation. We introduce Effective Semantic Isomers and show that only bonds promoting fast entropy convergence support stable Long CoT learning, while structural competition impairs training. Drawing on these findings, we present Mole-Syn, a distribution-transfer-graph method that guides the synthesis of effective Long CoT structures, boosting performance and RL stability across benchmarks.
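The abstract's "fast entropy convergence" criterion can be illustrated with a small sketch. The code below is a hypothetical toy, not the paper's method: it assumes we can read off a next-token probability distribution at successive fine-tuning checkpoints, computes the Shannon entropy of each, and flags a "bond" as stable when entropy collapses quickly, mimicking the contrast between reinforcing and competing structures.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in nats) of a probability distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def entropy_trace(checkpoint_dists):
    """Entropy of the next-token distribution at each training checkpoint."""
    return [shannon_entropy(p) for p in checkpoint_dists]

def converges_fast(trace, threshold=0.5, within=3):
    """True if entropy drops below `threshold` within the first `within` checkpoints."""
    return any(h < threshold for h in trace[:within])

# Toy traces (illustrative numbers only): a "stable bond" concentrates
# probability mass quickly; a "competing structure" keeps entropy high.
stable = entropy_trace([[0.5, 0.5], [0.8, 0.2], [0.95, 0.05]])
competing = entropy_trace([[0.5, 0.5], [0.55, 0.45], [0.6, 0.4]])

print(converges_fast(stable))     # → True  (fast entropy convergence)
print(converges_fast(competing))  # → False (entropy stays high)
```

Under the paper's framing, only the first kind of trajectory would support stable Long CoT learning; the thresholds and distributions here are illustrative assumptions.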
Similar Papers
Deconstructing Long Chain-of-Thought: A Structured Reasoning Optimization Framework for Long CoT Distillation
Artificial Intelligence
Teaches computers to think better, step-by-step.
Towards Reasoning Era: A Survey of Long Chain-of-Thought for Reasoning Large Language Models
Artificial Intelligence
Makes computers think deeper to solve hard problems.
Mol-R1: Towards Explicit Long-CoT Reasoning in Molecule Discovery
Computation and Language
Helps computers discover new medicines faster.