Information Theoretic Perspective on Representation Learning
By: Deborah Pereg
Potential Business Impact:
Helps computers learn better from data.
An information-theoretic framework is introduced to analyze last-layer embeddings, focusing on learned representations for regression tasks. We define the representation-rate and derive limits on the reliability with which input-output information can be represented, limits that are inherently determined by the input-source entropy. We further define representation capacity in a perturbed setting and representation rate-distortion for a compressed output. We derive the achievable capacity, the achievable representation-rate, and their converses. Finally, we combine these results in a unified setting.
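For context, the representation capacity and representation rate-distortion defined here echo the classical Shannon quantities. The abstract does not give the paper's exact formulations, so the following is only the standard textbook pair these notions presumably parallel:

% Classical Shannon definitions, for orientation; the paper's
% "representation" analogues are assumed to generalize these.
\[
  C \;=\; \max_{p(x)} I(X;Y)
  \quad \text{(channel capacity, maximized over input distributions)}
\]
\[
  R(D) \;=\; \min_{\substack{p(\hat{x}\mid x)\,:\; \mathbb{E}[d(X,\hat{X})] \le D}} I(X;\hat{X})
  \quad \text{(rate-distortion function at distortion level } D\text{)}
\]

In both cases an achievability result shows the quantity can be attained and a converse shows it cannot be exceeded, which matches the achievability/converse structure described in the abstract.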
Similar Papers
Redefining Information Theory: From Quantization and Rate-Distortion to a Foundational Mathematical Framework
Information Theory
Makes all math a simple code of 0s and 1s.
A Theoretical Framework for Rate-Distortion Limits in Learned Image Compression
Information Theory
Makes pictures smaller with less lost detail.
Information Physics of Intelligence: Unifying Logical Depth and Entropy under Thermodynamic Constraints
Information Theory
Makes AI smarter and use less energy.