Closed-Form Information Capacity of Canonical Signaling Models
By: Michał Komorowski
Potential Business Impact:
Measures how well cells understand signals.
We employ a unified framework for computing the information capacity of biological signaling systems using Fisher information. By deriving closed-form or easily computable information capacity formulas, we quantify how well different signaling models with binomial, multinomial, Poisson, Gaussian, and gamma output distributions can discriminate among input signals. These expressions clarify how key features such as signal range, noise scaling, pathway length, and receiver diversity shape the theoretical limits of sensing. In particular, we show how signal-to-noise ratio and fold-change sensitivity arise naturally within the Fisher formalism, and how signal degradation in cascades imposes linear information loss. Our results provide intuitive, analytically grounded tools to benchmark and guide the analysis of real signaling systems, without requiring computationally expensive mutual information estimation. While motivated by cellular communication, the framework generalizes to any system where noisy input-output relationships constrain transmission fidelity, including synthetic biology, sensor networks, and engineered communication channels.
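The specific formulas derived in the paper are not reproduced here, but the general Fisher-information route to capacity can be illustrated with a short numerical sketch. Under the standard small-noise (Jeffreys-prior) approximation, capacity in bits is C ≈ log2( (1/√(2πe)) ∫ √J(x) dx ), where J(x) is the Fisher information the output carries about the input x. The Python sketch below applies this to a Poisson channel with a hypothetical dose-response λ(x) = x; the function names, input range, and choice of λ are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def capacity_bits(fisher_info, x_min, x_max, n_grid=10_000):
    """Small-noise capacity approximation in bits:
    C ~= log2( (1/sqrt(2*pi*e)) * integral of sqrt(J(x)) dx over [x_min, x_max] ).
    `fisher_info` is a vectorized function returning J(x)."""
    x = np.linspace(x_min, x_max, n_grid)
    integral = np.trapz(np.sqrt(fisher_info(x)), x)
    return np.log2(integral / np.sqrt(2.0 * np.pi * np.e))

# Hypothetical example: Poisson output with mean lambda(x) = x,
# for which J(x) = (dlambda/dx)^2 / lambda(x) = 1 / x.
poisson_fisher = lambda x: 1.0 / x

print(capacity_bits(poisson_fisher, x_min=1.0, x_max=1000.0))  # roughly 3.9 bits
```

Swapping in a different noise model (Gaussian, gamma, binomial, and so on) only changes the J(x) passed to the same approximation, which is what makes Fisher-information-based capacity estimates cheap to benchmark compared with direct mutual information estimation.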
Similar Papers
From Bayesian Asymptotics to General Large-Scale MIMO Capacity
Information Theory
Improves wireless capacity estimates using Bayesian math.
Information-Theoretic Limits of Integrated Sensing and Communication with Finite Learning Capacity
Information Theory
AI helps devices share data and sense surroundings.
Intuitive dissection of the Gaussian information bottleneck method with an application to optimal prediction
Molecular Networks
Finds best way to remember important things.