Algorithms for Computing the Petz-Augustin Capacity
By: Chun-Neng Chu, Wei-Fu Tseng, Yen-Huan Li
Potential Business Impact:
Gives provably fast ways to compute how much information quantum communication channels can reliably carry.
We propose the first algorithms with non-asymptotic convergence guarantees for computing the Petz-Augustin capacity, which generalizes the channel capacity and characterizes the optimal error exponent in classical-quantum channel coding. This capacity can be equivalently expressed as the maximization of two generalizations of mutual information: the Petz-Rényi information and the Petz-Augustin information. To maximize the Petz-Rényi information, we show that it corresponds to a convex Hölder-smooth optimization problem, and hence the universal fast gradient method of Nesterov (2015), along with its convergence guarantees, readily applies. Regarding the maximization of the Petz-Augustin information, we adopt a two-layered approach: we show that the objective function is smooth relative to the negative Shannon entropy and can be efficiently optimized by entropic mirror descent; each iteration of entropic mirror descent requires computing the Petz-Augustin information, for which we propose a novel fixed-point algorithm and establish its contractivity with respect to the Thompson metric. Notably, this two-layered approach can be viewed as a generalization of the mirror-descent interpretation of the Blahut-Arimoto algorithm due to He et al. (2024).
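The outer loop of the two-layered approach is entropic mirror descent over the probability simplex, i.e., a multiplicative-weights update. The sketch below is a minimal illustration of that update scheme, not the authors' implementation: the true objective (the Petz-Augustin information) is replaced by a toy concave objective, Shannon entropy, whose maximizer over the simplex is the uniform distribution, and the function names and step size are illustrative assumptions.

```python
# Minimal sketch of entropic mirror ascent (multiplicative weights) on the
# probability simplex. The paper applies this scheme to the Petz-Augustin
# information; here we maximize a toy concave objective, Shannon entropy,
# whose maximizer over the simplex is the uniform distribution.
# All names and the step size are illustrative assumptions.
import math

def entropic_mirror_ascent(grad, p0, step=0.5, iters=50):
    """Maximize a concave f over the simplex via p_i <- p_i * exp(step * g_i) / Z."""
    p = list(p0)
    for _ in range(iters):
        g = grad(p)                                      # gradient at current iterate
        w = [pi * math.exp(step * gi) for pi, gi in zip(p, g)]
        z = sum(w)                                       # normalize back onto the simplex
        p = [wi / z for wi in w]
    return p

# Toy objective: H(p) = -sum_i p_i log p_i, with gradient entries -log p_i - 1.
entropy_grad = lambda p: [-math.log(pi) - 1.0 for pi in p]

p_star = entropic_mirror_ascent(entropy_grad, [0.7, 0.2, 0.1])
# p_star converges to the uniform distribution [1/3, 1/3, 1/3]
```

In the paper's setting, each gradient evaluation additionally requires computing the Petz-Augustin information, which is where the inner fixed-point algorithm (contractive in the Thompson metric) comes in.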
Similar Papers
On the Computability of Finding Capacity-Achieving Codes
Information Theory
Asks whether a computer can, even in principle, find codes that achieve channel capacity.
From Bayesian Asymptotics to General Large-Scale MIMO Capacity
Information Theory
Uses Bayesian asymptotics to estimate the capacity of large-scale MIMO wireless systems.
Strong converse exponent of channel interconversion
Quantum Physics
Measures how fast conversions between channels must fail beyond their capacity limits.