Transformers in Pseudo-Random Number Generation: A Dual Perspective on Theory and Practice
By: Ran Li, Lingshu Zeng
Potential Business Impact:
Computers learn to make better random numbers.
Pseudo-random number generators (PRNGs) are highly nonlinear processes and key building blocks in the optimization of large language models. Transformers excel at modeling complex nonlinear relationships, so it is natural to ask whether high-quality pseudo-random numbers can be generated with Transformers. In this paper, we explore this question from both theoretical and practical perspectives, highlighting the potential benefits and implications of Transformers in PRNGs. We theoretically demonstrate that decoder-only Transformer models with Chain-of-Thought can simulate both the Linear Congruential Generator (LCG) and the Mersenne Twister (MT) PRNGs. Based on this, we conclude that log-precision decoder-only Transformers can represent non-uniform $\text{AC}^0$. Our theoretical constructions are validated through experiments: the random numbers generated by Transformer-based PRNGs pass the majority of the NIST statistical tests, and their heat maps exhibit clear statistical randomness. Finally, we assess their performance under prediction attacks.
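For reference, the LCG mentioned above produces its sequence via the recurrence $x_{n+1} = (a x_n + c) \bmod m$. The following is a minimal Python sketch of such a generator, illustrating the kind of iterative computation the Transformer is shown to simulate; the parameter values are an assumed, commonly cited choice (the Numerical Recipes constants), not necessarily those used in the paper's experiments.

    # Minimal LCG sketch. Parameters a, c, m are an illustrative,
    # commonly cited choice, not the paper's experimental settings.
    def lcg(seed: int, a: int = 1664525, c: int = 1013904223, m: int = 2**32):
        """Yield an endless stream of outputs x_{n+1} = (a*x_n + c) mod m."""
        x = seed % m
        while True:
            x = (a * x + c) % m
            yield x

    # Usage: draw a few pseudo-random numbers from the stream.
    gen = lcg(seed=42)
    print([next(gen) for _ in range(5)])

Each output depends on the previous state through a modular affine map, which is why step-by-step (Chain-of-Thought) decoding is a natural fit for simulating the recurrence.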
Similar Papers
(How) Can Transformers Predict Pseudo-Random Numbers?
Machine Learning (CS)
Computers learn to guess secret number patterns.
Learning Pseudorandom Numbers with Transformers: Permuted Congruential Generators, Curricula, and Interpretability
Machine Learning (CS)
Computers learn secret number patterns better than humans.
Statistical Quality and Reproducibility of Pseudorandom Number Generators in Machine Learning technologies
Other Computer Science
Makes computer learning more fair and reliable.