Foundations of information theory for coding theory

Published: December 4, 2025 | arXiv ID: 2512.05316v1

By: El Mahdi Mouloua, Essaid Mohamed

This lecture note introduces information theory with particular emphasis on its relevance to algebraic coding theory. Building on Shannon's pioneering formulation of information, entropy, and channel capacity, the document develops the mathematical foundations for quantifying uncertainty and information transmission. Examples, including the binary symmetric channel, illustrate key concepts such as entropy, conditional entropy, mutual information, and the noisy channel model. The note then describes the principles of maximum likelihood decoding and Shannon's noisy channel coding theorem, which characterizes the theoretical limits of reliable communication over noisy channels. Students and researchers seeking a bridge between the probabilistic framework of information theory and the structural, algebraic techniques of modern coding theory will find this work helpful.
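As a rough illustration of two quantities the abstract mentions (not drawn from the lecture note itself), the minimal Python sketch below computes the binary entropy function H(p) and the standard capacity C = 1 - H(p) of a binary symmetric channel with crossover probability p; the function names and the example value of p are illustrative choices of this summary, not the note's.

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H(p) in bits; H(0) = H(1) = 0 by convention."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def bsc_capacity(crossover: float) -> float:
    """Capacity of a binary symmetric channel: C = 1 - H(p) bits per channel use."""
    return 1.0 - binary_entropy(crossover)

if __name__ == "__main__":
    p = 0.11  # illustrative crossover probability, not a value from the note
    print(f"H({p}) = {binary_entropy(p):.4f} bits")
    print(f"BSC capacity at p = {p}: {bsc_capacity(p):.4f} bits per channel use")
```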

Category
Computer Science:
Information Theory