Functional Information Decomposition: A First-Principles Approach to Analyzing Functional Relationships
By: Clifford Bohm, Vincent R. Ragusa, Arend Hintze, and more
Potential Business Impact:
Unlocks how parts work together to create a whole.
Information theory, originating from Shannon's work on communication systems, has become a fundamental tool across neuroscience, genetics, physics, and machine learning. However, applications of information theory are often limited to the simplest case: mutual information between two variables. A central challenge in extending information theory to multivariate systems is decomposition: understanding how the information that multiple variables collectively provide about a target can be broken down into distinct contributions attributable to individual variables or their interactions. Stated plainly, what is sought is a decomposition of the mutual information between a set of inputs (or parts) and an output (or whole). In this work, we introduce Functional Information Decomposition (FID), a new approach to information decomposition that differs from prior methods by operating on complete functional relationships rather than statistical correlations, enabling precise quantification of independent and synergistic contributions.
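To make the decomposition problem concrete, the sketch below works the canonical XOR example: each input alone carries zero bits of mutual information about the output, yet the two inputs together carry one full bit, a purely synergistic contribution that any decomposition must account for. This illustrates the problem the paper addresses, not the FID method itself; the `mutual_information` helper and the joint-table construction are illustrative assumptions.

```python
import numpy as np
from itertools import product

def mutual_information(joint):
    """Mutual information I(X;Y) in bits from a joint probability table p(x, y)."""
    px = joint.sum(axis=1, keepdims=True)   # marginal p(x)
    py = joint.sum(axis=0, keepdims=True)   # marginal p(y)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log2(joint / (px * py))
    return np.nansum(terms)                 # zero-probability cells contribute nothing

# XOR: Y = X1 ^ X2 with uniform binary inputs.
# Build the joint tables p(X1, Y), p(X2, Y), and p((X1, X2), Y).
states = list(product([0, 1], repeat=2))
p_x1_y = np.zeros((2, 2))
p_x2_y = np.zeros((2, 2))
p_x12_y = np.zeros((4, 2))
for i, (x1, x2) in enumerate(states):
    y = x1 ^ x2
    p_x1_y[x1, y] += 0.25
    p_x2_y[x2, y] += 0.25
    p_x12_y[i, y] += 0.25

print(f"I(X1;Y)    = {mutual_information(p_x1_y):.3f} bits")   # 0.000: X1 alone tells us nothing
print(f"I(X2;Y)    = {mutual_information(p_x2_y):.3f} bits")   # 0.000: X2 alone tells us nothing
print(f"I(X1,X2;Y) = {mutual_information(p_x12_y):.3f} bits")  # 1.000: the pair determines Y
```

Because the whole bit of information appears only when the inputs are considered jointly, a decomposition should assign it entirely to the synergistic term rather than to either input's independent contribution.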
Similar Papers
Multivariate Partial Information Decomposition: Constructions, Inconsistencies, and Alternative Measures
Information Theory
Finds how many things work together.
The Whole Is Less than the Sum of Parts: Subsystem Inconsistency in Partial Information Decomposition
Information Theory
Fixes how we measure information in complex systems.