Adjusted Kolmogorov Complexity of Binary Words with Empirical Entropy Normalization

Published: December 24, 2025 | arXiv ID: 2512.21193v1

By: Brani Vidakovic

Kolmogorov complexity of a finite binary word reflects both algorithmic structure and the empirical distribution of the symbols appearing in the word. Words whose symbol frequencies lie far from one half have smaller combinatorial richness and therefore appear less complex under the standard definition. In this paper an entropy-normalized complexity measure is introduced that divides the Kolmogorov complexity of a word by the empirical entropy of its observed distribution of zeros and ones. This adjustment isolates intrinsic descriptive complexity from the purely combinatorial effect of symbol imbalance. For Martin-Löf random sequences under constructive exchangeable measures, the adjusted complexity grows linearly, with a per-symbol rate that converges to one. A pathological construction shows that regularity of the underlying measure is essential. The proposed framework connects Kolmogorov complexity, empirical entropy, and randomness in a natural way and suggests applications in randomness testing and in the analysis of structured binary data.
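The adjusted measure described in the abstract is the ratio of K(w) to the Shannon entropy H of the empirical frequency of ones in w. Since K is uncomputable, any concrete illustration must substitute a computable proxy; the minimal Python sketch below (not taken from the paper) uses the bit length of an lzma encoding as that proxy and reports the per-symbol adjusted rate.

    import math
    import lzma
    import random

    def empirical_entropy(word: str) -> float:
        # Shannon entropy (bits per symbol) of the empirical 0/1 frequencies.
        p = word.count("1") / len(word)
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

    def adjusted_complexity(word: str) -> float:
        # Proxy for K(word): bit length of an lzma encoding. This is only a
        # computable upper bound; true Kolmogorov complexity is uncomputable.
        k_proxy = 8 * len(lzma.compress(word.encode("ascii")))
        h = empirical_entropy(word)
        # Degenerate words (all zeros or all ones) have zero entropy, so the
        # ratio is undefined; return infinity as a sentinel.
        return float("inf") if h == 0.0 else k_proxy / h

    # Example: a biased random word versus a highly structured one.
    biased = "".join("1" if random.random() < 0.1 else "0" for _ in range(10_000))
    structured = "01" * 5_000
    print(adjusted_complexity(biased) / len(biased))
    print(adjusted_complexity(structured) / len(structured))

Under this proxy the per-symbol value for the biased random word should sit near a constant on the order of one, while the structured word drops toward zero; this loosely mirrors the convergence statement, though compressor header overhead makes the estimate unreliable for short words.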

Category: Statistics (Computation)