Score: 2

QLENS: Towards A Quantum Perspective of Language Transformers

Published: October 13, 2025 | arXiv ID: 2510.11963v1

By: Aditya Gupta, Kirandeep Kaur, Vinayak Gupta

BigTech Affiliations: University of Washington

Potential Business Impact:

Describes how language models compute predictions using the mathematics of quantum physics.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

In natural language processing, current methods for understanding Transformers are successful at identifying intermediate predictions during a model's inference. However, these approaches function as limited diagnostic checkpoints, lacking a mathematical framework for mechanistically modeling how each layer facilitates transitions between these evolving states. This interpretability gap, together with past successes of interdisciplinary approaches, inspires us to turn to physics in search of a descriptive mathematical framework for Transformers. We observe that language models are intrinsically probabilistic, an attribute that is echoed in the core postulates of quantum mechanics. This parallel inspires us to translate insights from that discipline to natural language processing. Towards this objective, we propose QLENS, a novel attempt to develop a physics-based perspective on the Transformer generation process. Under QLENS, a Transformer is studied by converting its latent activations into a state vector in a Hilbert space derived from the model's output units. This state subsequently evolves through hidden layers - reformulated as unitary operators and analogously defined Hamiltonians - during inference. The model's final probability distribution is obtained by applying the Born rule to the end state using a specific measurement operator. To demonstrate QLENS's potential, we conduct a proof-of-concept by probing a toy Transformer to investigate the influence of individual layers on a model's prediction trajectory. We present our work as a foundation for cross-domain insights to be leveraged towards a broader understanding of Transformers.
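The abstract's pipeline (state vector in a Hilbert space, unitary layer evolution, Born-rule readout) can be sketched numerically. The code below is an illustrative toy, not the paper's implementation: the dimension, the random Hermitian "Hamiltonian," and the choice of a projective measurement in the standard basis are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4  # toy "vocabulary" size; hypothetical choice

# Build a random Hermitian matrix H to play the role of a layer's Hamiltonian.
A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
H = (A + A.conj().T) / 2

# A Hermitian H generates a unitary layer operator U = exp(-iH),
# computed here via the eigendecomposition of H.
w, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * w)) @ V.conj().T

# Normalized initial state: a stand-in for a latent activation mapped
# into the output-unit Hilbert space.
psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
psi = psi / np.linalg.norm(psi)

# Evolve through the "layer," then apply the Born rule with a projective
# measurement in the standard basis: p_i = |<i|U|psi>|^2.
psi_out = U @ psi
probs = np.abs(psi_out) ** 2

# Unitarity preserves the norm, so probs is a valid distribution.
assert np.isclose(probs.sum(), 1.0)
```

The key property this illustrates is that unitary evolution keeps the state normalized, so the Born rule always yields a proper probability distribution over output units.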

Country of Origin
🇺🇸 United States

Page Count
22 pages

Category
Computer Science:
Machine Learning (CS)