A path to natural language through tokenisation and transformers
By: David S. Berman, Alexander G. Stapleton
Potential Business Impact:
Helps computers understand language better by breaking words into smaller, statistically meaningful pieces.
Natural languages exhibit striking regularities in their statistical structure, most notably the emergence of Zipf's and Heaps' laws. Nevertheless, it remains largely unclear how these properties relate to the tokenisation schemes used in contemporary transformer models. In this note, we analyse the information content (as measured by the Shannon entropy) of various corpora under the assumption of a Zipfian frequency distribution, and derive a closed-form expression for the slot entropy expectation value. We then empirically investigate how byte-pair encoding (BPE) transforms corpus statistics, showing that recursive applications of BPE drive token frequencies toward a Zipfian power law while inducing a characteristic growth pattern in empirical entropy. Utilising the ability of transformers to learn context-dependent token probability distributions, we train language models on corpora tokenised at varying BPE depths, revealing that the models' predictive entropies increasingly agree with Zipf-derived predictions as the BPE depth increases. Attention-based diagnostics further indicate that deeper tokenisation reduces local token dependencies, bringing the empirical distribution closer to the weakly dependent (near-IID) regime. Together, these results clarify how BPE acts not only as a compression mechanism but also as a statistical transform that reconstructs key informational properties of natural language.
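The two quantities the abstract pairs up, the Shannon entropy of a Zipfian token distribution and the empirical entropy of a corpus under repeated BPE merges, can be illustrated with a short sketch. The snippet below is a minimal illustration, not the authors' code: the truncated Zipfian entropy is computed numerically rather than via the paper's closed-form slot-entropy expression, the corpus is a toy string, and the greedy single-pair merge is a stand-in for a full BPE trainer.

```python
import collections
import math


def zipf_entropy(vocab_size: int, alpha: float = 1.0) -> float:
    """Shannon entropy (bits) of a truncated Zipfian distribution
    p(r) = r**(-alpha) / Z over ranks r = 1..vocab_size."""
    weights = [r ** -alpha for r in range(1, vocab_size + 1)]
    z = sum(weights)
    return -sum((w / z) * math.log2(w / z) for w in weights)


def empirical_entropy(tokens: list[str]) -> float:
    """Shannon entropy (bits/token) of the observed token frequencies."""
    counts = collections.Counter(tokens)
    n = len(tokens)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())


def bpe_merge_once(tokens: list[str]) -> list[str]:
    """One BPE depth step: merge the most frequent adjacent token pair."""
    pairs = collections.Counter(zip(tokens, tokens[1:]))
    if not pairs:
        return tokens
    (a, b), _ = pairs.most_common(1)[0]
    merged, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and tokens[i] == a and tokens[i + 1] == b:
            merged.append(a + b)  # replace the pair with a composite token
            i += 2
        else:
            merged.append(tokens[i])
            i += 1
    return merged


# Toy corpus: individual characters are the depth-0 tokens.
tokens = list("the cat sat on the mat and the cat saw the rat " * 200)
for depth in range(16):
    if depth % 5 == 0:
        vocab = len(set(tokens))
        print(f"depth {depth:2d}: H = {empirical_entropy(tokens):.3f} bits/token, "
              f"Zipf prediction for |V|={vocab}: {zipf_entropy(vocab):.3f} bits")
    tokens = bpe_merge_once(tokens)
```

Running this shows the qualitative pattern the paper studies quantitatively: as merge depth grows, each token carries more characters, the empirical entropy per token rises, and the rank-frequency counts flatten toward a power law that the Zipfian prediction tracks increasingly well.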
Similar Papers
Entropy-Driven Pre-Tokenization for Byte-Pair Encoding
Computation and Language
Helps computers understand Chinese words better.
Parity-Aware Byte-Pair Encoding: Improving Cross-lingual Fairness in Tokenization
Computation and Language
Makes computer language tools fair for all languages.
Know Your Limits: Entropy Estimation Modeling for Compression and Generalization
Computation and Language
Makes computers understand and write language better.