Multiplicative Turing Ensembles, Pareto's Law, and Creativity
By: Alexander Kolpakov, Aidan Rocke
Potential Business Impact:
Explains statistical patterns in how software packages grow in size, and why a few grow very large.
We study integer-valued multiplicative dynamics driven by i.i.d. prime multipliers and connect their macroscopic statistics to universal codelengths. We introduce the Multiplicative Turing Ensemble (MTE) and show how it arises naturally - though not uniquely - from ensembles of probabilistic Turing machines. Our modeling principle is variational: taking Elias' Omega codelength as an energy and imposing maximum entropy constraints yields a canonical Gibbs prior on integers and, by restriction, on primes. Under mild tail assumptions, this prior induces exponential tails for log-multipliers (up to slowly varying corrections), which in turn generate Pareto tails for additive gaps. We also prove time-average laws for the Omega codelength along MTE trajectories. Empirically, on Debian and PyPI package size datasets, a scaled Omega prior achieves the lowest KL divergence against codelength histograms. Taken together, the theory-data comparison suggests a qualitative split: machine-adapted regimes (Gibbs-aligned, finite first moment) exhibit clean averaging behavior, whereas human-generated complexity appears to sit beyond this regime, with tails heavy enough to produce an unbounded first moment, and therefore no averaging of the same kind.
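To make the abstract's ingredients concrete, the following Python sketch (an illustration under stated assumptions, not the authors' code) computes Elias' Omega codelength, builds a Gibbs prior proportional to exp(-beta * L_omega(n)) restricted to primes, simulates a multiplicative trajectory driven by i.i.d. prime multipliers, and reports the per-step Omega codelength along the trajectory. The truncation bound N_MAX, the inverse temperature BETA = ln 2, and the trajectory length T are illustrative assumptions, not values taken from the paper.

```python
# A minimal sketch (not the authors' code) of the pipeline described in the
# abstract: Elias' Omega codelength as an energy, a Gibbs prior
# p(n) proportional to exp(-beta * L_omega(n)) restricted to primes, and a
# multiplicative trajectory X_t = X_{t-1} * p_t with i.i.d. prime multipliers.
# N_MAX, BETA, and T are illustrative choices, not values from the paper.

import math
import random

def elias_omega_length(n: int) -> int:
    """Length in bits of the Elias Omega code of a positive integer n."""
    length = 1                        # terminating '0' bit
    while n > 1:
        k = n.bit_length()            # the binary expansion of n has k bits
        length += k
        n = k - 1                     # recurse on (bit length - 1)
    return length

def primes_up_to(limit: int) -> list:
    """Sieve of Eratosthenes."""
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [p for p, is_prime in enumerate(sieve) if is_prime]

# Gibbs prior on primes, truncated at N_MAX so it can be normalized numerically.
N_MAX = 10_000                        # truncation bound (illustrative)
BETA = math.log(2)                    # beta = ln 2 gives weights 2^{-L_omega(p)}
primes = primes_up_to(N_MAX)
weights = [math.exp(-BETA * elias_omega_length(p)) for p in primes]

# One MTE trajectory: multiply by i.i.d. primes drawn from the Gibbs prior.
T = 1_000                             # trajectory length (illustrative)
x = 1
for _ in range(T):
    x *= random.choices(primes, weights=weights, k=1)[0]

# A simple time-averaged statistic along the trajectory: the Omega codelength
# of X_T per step, which should stabilize as T grows in the finite-first-moment
# ("machine-adapted") regime described in the abstract.
print(f"Omega codelength of X_T per step: {elias_omega_length(x) / T:.2f} bits")
```

The choice beta = ln 2 makes the Gibbs weights equal to 2^{-L_omega(n)}, which is summable by the Kraft inequality because the Omega code is a prefix code; in this regime the printed per-step codelength should settle toward a constant as T grows, consistent with the averaging behavior the abstract attributes to machine-adapted regimes.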
Similar Papers
Benford's Law from Turing Ensembles and Integer Partitions
Information Theory
Explains why numbers start with 1 more often.
Models of Heavy-Tailed Mechanistic Universality
Machine Learning (Stat)
Explains why heavy-tailed patterns show up in machine learning models and data.