Empirical Coordination over Markov Channel with Independent Source
By: Mengyuan Zhao, Maël Le Treust, Tobias J. Oechtering
Potential Business Impact:
Helps computers send messages reliably through noisy connections.
We study joint source-channel coding over Markov channels through the empirical coordination framework. More specifically, we aim to determine the empirical distributions of source and channel symbols that can be induced by a coding scheme. We consider strictly causal encoders that generate the channel inputs without access to the past channel states; these inputs in turn drive the current Markov state evolution. Our main result is a pair of single-letter inner and outer bounds on the set of achievable joint distributions coordinating all the symbols in the network. To establish the inner bound, we introduce a new notion of typicality, which we call input-driven Markov typicality, and develop its fundamental properties. In contrast to classical block-Markov coding schemes that rely on blockwise independence for discrete memoryless channels, our analysis directly exploits the Markov channel structure and improves upon independence-based arguments.
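To make the setting concrete, here is a minimal simulation sketch, not the paper's coding scheme: the binary alphabets, the transition kernels P_Znext and P_Y, and the toy strictly causal encoder are all illustrative assumptions. It shows a channel state that evolves as a Markov chain driven by the current input, and the empirical joint distribution of source, input, and output symbols induced by one block of transmissions, which is the kind of object the inner and outer bounds characterize.

```python
import numpy as np

# Illustrative sketch only: alphabets, kernels, and the encoder below are
# placeholder assumptions, not taken from the paper.
rng = np.random.default_rng(0)

S_ALPH, X_ALPH, Z_ALPH, Y_ALPH = 2, 2, 2, 2   # hypothetical binary alphabets
n = 100_000                                    # block length

P_S = np.array([0.5, 0.5])                     # i.i.d. source, independent of the channel

# Channel state evolves as a Markov chain driven by the current input:
# P(z_{t+1} | z_t, x_t), indexed [z_t][x_t][z_{t+1}]. Values are illustrative.
P_Znext = np.array([[[0.9, 0.1], [0.2, 0.8]],
                    [[0.7, 0.3], [0.4, 0.6]]])

# Channel output given input and current state: P(y_t | x_t, z_t),
# indexed [x_t][z_t][y_t]. Values are illustrative.
P_Y = np.array([[[0.95, 0.05], [0.6, 0.4]],
                [[0.3, 0.7], [0.1, 0.9]]])

def encoder(past_sources):
    """Toy strictly causal encoder: uses only s^{t-1}, never the channel states."""
    return past_sources[-1] if past_sources else 0

counts = np.zeros((S_ALPH, X_ALPH, Y_ALPH))
z = 0
sources = []
for t in range(n):
    x = encoder(sources)                        # X_t = f_t(S^{t-1})
    s = rng.choice(S_ALPH, p=P_S)               # S_t drawn independently
    y = rng.choice(Y_ALPH, p=P_Y[x, z])         # Y_t ~ P(. | x_t, z_t)
    counts[s, x, y] += 1
    z = rng.choice(Z_ALPH, p=P_Znext[z, x])     # input-driven state update
    sources.append(s)

# Empirical joint distribution induced by this scheme; the coordination question
# is which target distributions such empirical types can approach as n grows.
empirical = counts / n
print(empirical)
```

In the empirical coordination framework, a target joint distribution is achievable when the (random) empirical distribution computed this way can be made arbitrarily close to it in total variation with high probability.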
Similar Papers
Optimal Source Coding of Markov Chains for Real-Time Remote Estimation
Information Theory
Makes sending information faster and more efficient.
One-Shot Broadcast Joint Source-Channel Coding with Codebook Diversity
Information Theory
Lets one message reach many people, even with bad signals.
Remote Channel Synthesis
Information Theory
Lets computers guess hidden information from noisy data.