Shannon invariants: A scalable approach to information decomposition

Published: April 22, 2025 | arXiv ID: 2504.15779v1

By: Aaron J. Gutknecht, Fernando E. Rosas, David A. Ehrlich and more

Potential Business Impact:

Offers a scalable way to measure how information is processed across large systems such as brains and AI models, revealing how deep learning architectures handle information as they train.

Business Areas:
Big Data, Data and Analytics

Distributed systems, such as biological and artificial neural networks, process information via complex interactions engaging multiple subsystems, resulting in high-order patterns with distinct properties across scales. Investigating how these systems process information remains challenging due to difficulties in defining appropriate multivariate metrics and ensuring their scalability to large systems. To address these challenges, we introduce a novel framework based on what we call "Shannon invariants" -- quantities that capture essential properties of high-order information processing in a way that depends only on the definition of entropy and can be efficiently calculated for large systems. Our theoretical results demonstrate how Shannon invariants can be used to resolve long-standing ambiguities regarding the interpretation of widely used multivariate information-theoretic measures. Moreover, our practical results reveal distinctive information-processing signatures of various deep learning architectures across layers, yielding new insights into how these systems process information and how this processing evolves during training. Overall, our framework resolves fundamental limitations in analyzing high-order phenomena and offers broad opportunities for theoretical developments and empirical analyses.
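
The abstract's claim that these quantities "depend only on the definition of entropy" can be made concrete with classic examples of entropy-only multivariate measures: total correlation, dual total correlation, and their difference, the O-information. These are among the widely used measures whose interpretation the paper addresses. The sketch below is illustrative, not code from the paper; the toy XOR distribution and all function names are assumptions for demonstration.

```python
import itertools
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a flattened probability array."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def marginal(joint, keep):
    """Marginalize a joint probability array down to the axes in `keep`."""
    drop = tuple(ax for ax in range(joint.ndim) if ax not in keep)
    return joint.sum(axis=drop)

def total_correlation(joint):
    """TC = sum_i H(X_i) - H(X_1,...,X_n): overall redundancy."""
    n = joint.ndim
    return (sum(entropy(marginal(joint, (i,)).ravel()) for i in range(n))
            - entropy(joint.ravel()))

def dual_total_correlation(joint):
    """DTC = H(X_1,...,X_n) - sum_i H(X_i | rest): shared information."""
    n = joint.ndim
    h_all = entropy(joint.ravel())
    h_cond = 0.0
    for i in range(n):
        rest = tuple(j for j in range(n) if j != i)
        h_cond += h_all - entropy(marginal(joint, rest).ravel())  # H(X_i | rest)
    return h_all - h_cond

# Toy system: three binary variables with X3 = XOR(X1, X2).
joint = np.zeros((2, 2, 2))
for x1, x2 in itertools.product((0, 1), repeat=2):
    joint[x1, x2, (x1 + x2) % 2] = 0.25

tc, dtc = total_correlation(joint), dual_total_correlation(joint)
print(f"TC = {tc:.1f} bits, DTC = {dtc:.1f} bits, O-information = {tc - dtc:.1f}")
# -> TC = 1.0 bits, DTC = 2.0 bits, O-information = -1.0
```

In the XOR example the O-information comes out negative (TC = 1 bit, DTC = 2 bits), the usual signature of synergy-dominated interactions. Every term reduces to a joint or marginal entropy, which is the kind of scalability the abstract emphasizes for analyzing large systems.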

Country of Origin
🇬🇧 🇩🇪 United Kingdom, Germany

Page Count
16 pages

Category
Computer Science:
Information Theory