Neural networks leverage nominally quantum and post-quantum representations
By: Paul M. Riechers, Thomas J. Elliott, Adam S. Shai
Potential Business Impact:
Computers learn to guess future events like magic.
We show that deep neural networks, including transformers and RNNs, pretrained as usual on next-token prediction, intrinsically discover and represent beliefs over 'quantum' and 'post-quantum' low-dimensional generative models of their training data -- as if performing iterative Bayesian updates over the latent state of this world model as they observe more context during inference. Notably, neural networks find these representations easily, whereas no finite classical circuit can do the same. The resulting geometric relationships among neural activations induced by different input sequences are largely independent of neural-network architecture. Each point in this geometry corresponds to a history-induced probability density over all possible futures, and the relative displacement of these points reflects the difference in mechanism and magnitude by which these distinct pasts affect the future.
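The iterative Bayesian updating described above can be illustrated with a minimal classical analog: conditioning a belief over the hidden state of a small hidden Markov model on each observed token. This is only a sketch of the update rule the networks are said to mirror; the paper's actual claim concerns quantum and post-quantum generative models, and the transition matrices below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Labeled transition matrices for a hypothetical 2-state HMM:
# T[x][i, j] = P(next state = j, emit token = x | current state = i).
# Their sum over tokens is row-stochastic.
T = {
    0: np.array([[0.5, 0.0],
                 [0.0, 0.25]]),
    1: np.array([[0.0, 0.5],
                 [0.75, 0.0]]),
}

def update_belief(eta, x):
    """One Bayesian step: condition belief eta on observing token x."""
    unnormalized = eta @ T[x]
    return unnormalized / unnormalized.sum()

# Start from some prior belief and update token by token over a context.
eta = np.array([0.6, 0.4])
for token in [0, 1, 1, 0]:
    eta = update_belief(eta, token)

print(eta)  # belief over latent states induced by this particular history
```

Each context induces such a belief point, and the paper's claim is that the geometry of these points is what shows up, linearly embedded, in the network's activations.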
Similar Papers
Natural Quantization of Neural Networks
Quantum Physics
Makes computers learn better with quantum tricks.
Transforming Traditional Neural Networks into Neuromorphic Quantum-Cognitive Models: A Tutorial with Applications
Machine Learning (CS)
Lets anyone build brain-like AI on laptops.
Quantum Mechanics and Neural Networks
High Energy Physics - Theory
Makes quantum physics work like computer programs.