Information theory
Information theory is a branch of applied mathematics and electrical engineering that studies the quantification, storage, and communication of information. It provides a mathematical framework for measuring information and for understanding the limits of data compression and reliable communication. The field emerged in the 1940s, drawing on Ralph Hartley’s idea that information can be measured as a function of the logarithm of the number of possible messages, and was formalized by Claude E. Shannon. Shannon introduced probabilistic models of information sources and channels and established fundamental limits known as coding theorems.
The central concepts include entropy, mutual information, and channel capacity. For a discrete random variable X with distribution p(x), the entropy is H(X) = −∑ p(x) log₂ p(x), the average number of bits needed to describe an outcome of X. Mutual information measures how much observing one variable reduces uncertainty about another, and channel capacity is the maximum rate at which information can be transmitted reliably over a noisy channel.
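The entropy formula above can be sketched directly in a few lines of Python (a minimal illustration; the function name `entropy` is my own):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution given as a list of probabilities."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin is maximally unpredictable: exactly 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so each toss carries less information.
print(entropy([0.9, 0.1]))
# A certain outcome carries no information at all.
print(entropy([1.0]))        # 0.0
```

Note the `pi > 0` guard, which implements the usual convention that 0 · log 0 = 0.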
Applications include data compression (for example, the lossy compression used in MP3 and JPEG), error-correcting codes, and the design of reliable communication systems.