Score: 1

The Mean-Field Dynamics of Transformers

Published: December 1, 2025 | arXiv ID: 2512.01868v1

By: Philippe Rigollet

Potential Business Impact:

Explains when deep attention stacks collapse token representations into a single cluster, and identifies normalization and context-length regimes that preserve richer multi-cluster structure, informing the design of long-context models.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

We develop a mathematical framework that interprets Transformer attention as an interacting particle system and studies its continuum (mean-field) limits. By idealizing attention as a continuous-time dynamic on the sphere, we connect Transformer dynamics to Wasserstein gradient flows, synchronization models (Kuramoto), and mean-shift clustering. Central to our results is a global clustering phenomenon whereby tokens asymptotically collapse into a single cluster after long metastable phases during which they are arranged into multiple clusters. We further analyze a tractable equiangular reduction to obtain exact clustering rates, show how commonly used normalization schemes alter contraction speeds, and identify a phase transition for long-context attention. The results highlight both the mechanisms that drive representation collapse and the regimes that preserve expressive, multi-cluster structure in deep attention architectures.
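
The interacting-particle view can be simulated directly. Below is a minimal Python sketch of one commonly studied idealization of self-attention as a dynamic on the unit sphere, in the spirit of the framework described above; the specific ODE, Euler discretization, and parameter values (n, d, beta, dt) are illustrative assumptions, not the paper's exact setup. Pairwise inner products approaching 1 signal the global clustering (representation collapse) described in the abstract.

```python
import numpy as np

# Sketch (assumed dynamics): tokens as particles on the unit sphere,
# evolving under idealized continuous-time self-attention.
rng = np.random.default_rng(0)
n, d = 32, 3          # number of tokens, embedding dimension
beta = 4.0            # inverse temperature (attention sharpness)
dt, steps = 0.05, 4000

# Random initial tokens on the sphere S^{d-1}
x = rng.standard_normal((n, d))
x /= np.linalg.norm(x, axis=1, keepdims=True)

for _ in range(steps):
    # Softmax attention weights A_ij proportional to exp(beta * <x_i, x_j>)
    logits = beta * x @ x.T
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    A = np.exp(logits)
    A /= A.sum(axis=1, keepdims=True)
    v = A @ x                                     # attention output per token
    # Project onto the tangent space so tokens stay on the sphere:
    # P_x(v) = v - <x, v> x
    v -= np.einsum("ij,ij->i", x, v)[:, None] * x
    x += dt * v
    x /= np.linalg.norm(x, axis=1, keepdims=True) # correct Euler drift

# Pairwise inner products near 1 indicate global clustering (collapse)
print("min pairwise <x_i, x_j>:", (x @ x.T).min().round(4))
```

In this toy setting, increasing beta typically sharpens attention and prolongs the metastable multi-cluster phases before the eventual collapse, consistent with the metastability described in the abstract.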

Page Count
19 pages

Category
Computer Science:
Machine Learning (CS)