Super-Linear Growth of the Capacity-Achieving Input Support for the Amplitude-Constrained AWGN Channel
By: Haiyang Wang
Potential Business Impact:
Shows how many distinct signal levels are needed to reach maximum data rates under peak-power limits, guiding modulation design.
We study the growth of the support size of the capacity-achieving input distribution for the amplitude-constrained additive white Gaussian noise (AWGN) channel. While it has been known since Smith (1971) that the optimal input is discrete with finitely many mass points, tight bounds on the number of support points $K(A)$ as the amplitude constraint $A$ increases remain open. Building on recent work by Dytso \emph{et al.} (2019) and Mattingly \emph{et al.} (2018), we derive a new analytical lower bound showing that $K(A)$ grows super-linearly in $A$. Our approach combines total-variation convergence of the output distribution to the uniform law with quantitative limits on Gaussian mixture approximation.
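The quantity at issue, $K(A)$, can be probed numerically. As an illustration only (not the paper's method), a discretized Blahut-Arimoto iteration over candidate mass points in $[-A, A]$ recovers the known small-$A$ behavior, where Smith's result gives a binary optimal input at $\pm A$. The function name `ba_support`, the grid sizes, and the support threshold below are our own choices for this sketch, not from the paper.

```python
import numpy as np

def ba_support(A, sigma=1.0, n_in=41, n_out=801, iters=3000, tol=1e-3):
    """Estimate the capacity-achieving input support for the amplitude-
    constrained AWGN channel via Blahut-Arimoto on a fine input grid.

    Returns (support_size, p, x): the count of grid points whose optimized
    probability exceeds `tol`, the input distribution, and the grid itself.
    """
    x = np.linspace(-A, A, n_in)                       # candidate mass points
    y = np.linspace(-A - 6 * sigma, A + 6 * sigma, n_out)
    # Row-normalized channel matrix: W[i, j] ~ P(y_j | x_i) under N(x_i, sigma^2).
    W = np.exp(-0.5 * ((y[None, :] - x[:, None]) / sigma) ** 2)
    W /= W.sum(axis=1, keepdims=True)
    H = (W * np.log(W)).sum(axis=1)                    # sum_j W_ij log W_ij
    p = np.full(n_in, 1.0 / n_in)                      # start from uniform
    for _ in range(iters):
        q = p @ W                                      # induced output law
        D = H - W @ np.log(q)                          # D_i = KL(W(.|x_i) || q)
        p = p * np.exp(D)                              # multiplicative BA update
        p /= p.sum()
    return int((p > tol).sum()), p, x
```

For $A = 1$ with unit noise, essentially all probability mass should concentrate at $\pm 1$, consistent with the binary optimum for small amplitudes; counting points above a small threshold then gives a numerical estimate of $K(A)$, though such estimates cannot by themselves establish the super-linear growth rate proved analytically in the paper.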
Similar Papers
Phase Transitions of the Additive Uniform Noise Channel with Peak Amplitude and Cost Constraint
Information Theory
Pinpoints when the best signaling strategy abruptly changes as power limits shift.
Volume-Based Lower Bounds to the Capacity of the Gaussian Channel Under Pointwise Additive Input Constraints
Information Theory
Guarantees minimum data rates for noisy channels with strict per-signal limits.
The Shannon Upper Bound for the Error Exponent
Information Theory
Predicts how fast decoding errors shrink as messages get longer.