QLENS: Towards A Quantum Perspective of Language Transformers
By: Aditya Gupta, Kirandeep Kaur, Vinayak Gupta
Potential Business Impact:
Explains how computer language models can be described with the mathematics of physics.
In natural language processing, current methods for understanding Transformers are successful at identifying intermediate predictions during a model's inference. However, these approaches function as limited diagnostic checkpoints, lacking a mathematical framework for mechanistically modeling how each layer facilitates transitions between these evolving states. This interpretability gap, together with past successes of interdisciplinary outlooks, inspires us to turn to physics in search of a descriptive mathematical framework for Transformers. We observe that language models are intrinsically probabilistic, an attribute that is echoed in the core postulates of quantum mechanics. This parallel motivates us to translate insights from that discipline to natural language processing. Towards this objective, we propose QLENS, a novel attempt to develop a physics-based perspective on the Transformer generation process. Under QLENS, a Transformer is studied by converting its latent activations into a state vector in a Hilbert space derived from the model's output units. This state subsequently evolves through the hidden layers, reformulated as unitary operators and analogously defined Hamiltonians, during inference. The model's final probability distribution is obtained by applying the Born rule to the end state using a specific measurement operator. To demonstrate QLENS's potential, we conduct a proof-of-concept by probing a toy Transformer to investigate the influence of individual layers on the model's prediction trajectory. We present our work as a foundation for cross-domain insights to be leveraged towards a broader understanding of Transformers.
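The pipeline the abstract describes can be sketched numerically: encode an activation (already read out in the output-vocabulary basis) as a unit state vector, evolve it with a unitary generated by a Hermitian "Hamiltonian", and recover a probability distribution via the Born rule. The following is a minimal illustration under assumed details, not the paper's actual construction: the vocabulary size, the softmax-style amplitude encoding, and the random Hermitian generator are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = 8  # toy output-vocabulary size = Hilbert-space dimension (assumed)

# Hypothetical latent activation already projected to vocab space
# (analogous to a logit-lens readout; purely illustrative values).
logits = rng.normal(size=vocab)

# Encode as a unit state vector: amplitudes are square roots of a
# softmax distribution, so |amp_k|^2 sums to 1.
amps = np.sqrt(np.exp(logits) / np.exp(logits).sum())
state = amps.astype(complex)
assert np.isclose(np.vdot(state, state).real, 1.0)

# One hidden layer as a unitary U = exp(iH) generated by a Hermitian H,
# built here from a random matrix via eigendecomposition.
A = rng.normal(size=(vocab, vocab)) + 1j * rng.normal(size=(vocab, vocab))
H = (A + A.conj().T) / 2                      # Hermitian "Hamiltonian"
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(1j * evals)) @ evecs.conj().T
assert np.allclose(U @ U.conj().T, np.eye(vocab))  # unitarity check

# Evolve the state through the "layer".
state = U @ state

# Born rule with a computational-basis measurement: p_k = |<k|state>|^2.
probs = np.abs(state) ** 2
print(np.round(probs, 3))
```

Because the evolution is unitary, the probabilities remain normalized after every layer, which is the property that makes the quantum analogy well-posed for a probabilistic language model.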
Similar Papers
Quantum-Enhanced Natural Language Generation: A Multi-Model Framework with Hybrid Quantum-Classical Architectures
Quantum Physics
Quantum computers write text, sometimes with unique skills.
Moving Beyond Next-Token Prediction: Transformers are Context-Sensitive Language Generators
Computation and Language
Explains how smart computer programs think.
Quantum Natural Language Processing: A Comprehensive Review of Models, Methods, and Applications
Computation and Language
Computers understand words better using tiny quantum tricks.