High signal-to-noise ratio asymptotics of entropy-constrained Gaussian channel capacity
By: Adway Girish, Shlomo Shamai, Emre Telatar
We study the capacity of the Gaussian channel under an input entropy constraint in the asymptotic high signal-to-noise ratio (SNR) regime. We show that as the SNR tends to infinity, the capacity-achieving input distribution is a discrete Gaussian distribution supported on a scaled integer lattice. Further, we show that the gap between the input entropy and the capacity decays to zero exponentially in the SNR, and we characterize this exponent.
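As a hedged illustration of the central object in the abstract, the sketch below numerically computes the entropy of a discrete Gaussian distribution on a scaled integer lattice aZ, with probabilities proportional to exp(-(ak)^2 / (2s^2)). This is not the paper's construction or proof, only a standard numerical check of a known fact: as the lattice spacing a shrinks (the analogue of high SNR), the discrete entropy approaches the differential entropy of the Gaussian, (1/2) log(2*pi*e*s^2), plus log(1/a). The function name and parameters are illustrative choices, not from the paper.

```python
import numpy as np

def discrete_gaussian_entropy(a, s, kmax=10000):
    """Entropy (in nats) of a discrete Gaussian on the lattice a*Z.

    Probabilities are proportional to exp(-(a*k)^2 / (2*s^2)),
    truncated to k in [-kmax, kmax] (negligible error for kmax*a >> s).
    """
    k = np.arange(-kmax, kmax + 1)
    w = np.exp(-(a * k) ** 2 / (2 * s ** 2))
    p = w / w.sum()
    p = p[p > 0]  # drop underflowed tail terms before taking logs
    return -np.sum(p * np.log(p))

# For small spacing a, the entropy is close to
# 0.5 * log(2*pi*e*s^2) - log(a), i.e. it grows like log(1/a).
```

For example, with s = 1 and a = 0.01, the computed entropy agrees with 0.5*log(2*pi*e) - log(0.01) to high precision, since the discretization error vanishes super-exponentially as a → 0.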
Similar Papers
An Improved Lower Bound on Cardinality of Support of the Amplitude-Constrained AWGN Channel
Information Theory
On entropy-constrained Gaussian channel capacity via the moment problem
Information Theory
From Bayesian Asymptotics to General Large-Scale MIMO Capacity
Information Theory