Rate-distortion theory
Rate-distortion theory is the branch of information theory that analyzes the trade-off between the bitrate required to compress a source and the fidelity of its reconstruction. It formalizes lossy compression by specifying a source X with a known probability law, a per-letter distortion measure d(x, x̂), and a reconstruction X̂ produced by an encoder-decoder pair. For block codes of length n, distortion is averaged per letter, d(x^n, x̂^n) = (1/n) Σ_{i=1}^n d(x_i, x̂_i); the average distortion is D = E[d(X^n, X̂^n)], and the rate R is the average number of bits spent per source symbol.
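As a toy numeric illustration of these definitions (a minimal sketch; the block length, codebook size, and variable names below are hypothetical, not from the source):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
x = rng.integers(0, 2, size=n)        # source block x^n (binary, for simplicity)
x_hat = x.copy()
x_hat[0] ^= 1                         # reconstruction differing in one symbol

# Per-letter Hamming distortion: d(x^n, xhat^n) = (1/n) sum_i d(x_i, xhat_i)
D = np.mean(x != x_hat)

# A code with M codewords spends log2(M) bits per block, i.e. R = log2(M)/n per symbol
M = 16
R = np.log2(M) / n
print(f"D = {D:.3f} per symbol, R = {R:.3f} bits per symbol")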
The central object is the rate-distortion function R(D), defined as the minimum attainable rate for a given maximum allowed average distortion D. Shannon's single-letter characterization expresses it as an optimization over test channels: R(D) = min I(X; X̂), where the minimum is over all conditional distributions p(x̂|x) satisfying E[d(X, X̂)] ≤ D, and I(X; X̂) is the mutual information between source and reconstruction. R(D) is non-increasing and convex in D.
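A standard worked example makes the definition concrete: for a binary source under Hamming distortion, R(D) has a well-known closed form, stated here in LaTeX with H_b denoting the binary entropy function.

```latex
% Binary source X ~ Bernoulli(p), Hamming distortion d(x,\hat{x}) = \mathbf{1}\{x \neq \hat{x}\}:
R(D) =
  \begin{cases}
    H_b(p) - H_b(D), & 0 \le D \le \min(p, 1-p),\\[2pt]
    0, & D > \min(p, 1-p),
  \end{cases}
\qquad
H_b(t) = -t \log_2 t - (1-t)\log_2(1-t).
```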
Shannon established the rate-distortion theorem: for any R > R(D) there exist codes of length n achieving average distortion arbitrarily close to D as n grows, while conversely no sequence of codes with rate R < R(D) can achieve average distortion D. Thus R(D) exactly characterizes the best attainable lossy compression performance.
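The theorem can be probed empirically. The sketch below (assumed setup: a unit-variance Gaussian source, a simple uniform-bin scalar quantizer with conditional-mean reconstruction, squared-error distortion) compares a practical code's distortion at rate R against the known Gaussian rate-distortion function R(D) = (1/2) log2(σ²/D); the quantizer range and sample size are illustrative choices, not canonical.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 1.0
x = rng.normal(0.0, 1.0, 200_000)            # unit-variance Gaussian source

# Scalar quantizer: 2^R uniform bins on [-4, 4], each reconstructed by its
# empirical bin mean (a simple code, far from an optimal vector code).
for R in (1, 2, 3, 4):
    levels = 2 ** R
    edges = np.linspace(-4.0, 4.0, levels + 1)
    idx = np.clip(np.digitize(x, edges) - 1, 0, levels - 1)
    recon = np.array([x[idx == k].mean() for k in range(levels)])
    D = np.mean((x - recon[idx]) ** 2)
    # Gaussian rate-distortion function: R(D) = 0.5 * log2(sigma2 / D)
    print(f"R={R} bits/symbol: quantizer D={D:.4f}, "
          f"Shannon needs {0.5 * np.log2(sigma2 / D):.2f} bits")
```

At every rate the quantizer's distortion exceeds what the theorem promises is achievable, and the gap narrows as the code is refined, which is the direct operational content of R(D).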
Extensions include rate-distortion with side information at the decoder (Wyner–Ziv coding) and multi-terminal rate-distortion problems. Algorithmically, R(D) rarely admits a closed form; for discrete sources it is computed numerically, most commonly with the Blahut–Arimoto algorithm.
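A minimal sketch of the Blahut–Arimoto iteration for a discrete source follows; the function name, the `beta` slope parameterization, and the fixed iteration count are illustrative choices, not a canonical API.

```python
import numpy as np

def blahut_arimoto(p, d, beta, iters=300):
    """Compute one (R, D) point on the rate-distortion curve.

    p    : source distribution p(x), shape (nx,)
    d    : distortion matrix d[x, xhat], shape (nx, nxh)
    beta : slope parameter (> 0); larger beta favors lower distortion
    """
    nxh = d.shape[1]
    q = np.full(nxh, 1.0 / nxh)                  # reconstruction marginal q(xhat)
    for _ in range(iters):
        w = q[None, :] * np.exp(-beta * d)       # unnormalized test channel
        Q = w / w.sum(axis=1, keepdims=True)     # optimal Q(xhat | x) for current q
        q = p @ Q                                # re-optimize the marginal
    D = float(np.sum(p[:, None] * Q * d))
    R = float(np.sum(p[:, None] * Q * np.log2(Q / q[None, :])))  # I(X; Xhat) in bits
    return R, D

# Bernoulli(0.2) source with Hamming distortion; results should trace
# the closed form R(D) = H_b(0.2) - H_b(D) from the binary example above.
p = np.array([0.8, 0.2])
d = np.array([[0.0, 1.0], [1.0, 0.0]])
for beta in (2.0, 3.0, 4.0):
    R, D = blahut_arimoto(p, d, beta)
    print(f"beta={beta}: R={R:.3f} bits, D={D:.3f}")
```

Each value of `beta` traces out one point on the curve, so sweeping it recovers R(D); alternating the two updates is guaranteed to converge because each step solves one half of the mutual-information minimization exactly.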
Applications span lossy data compression (audio, image, and video coding), communications, and multimedia streaming, where one trades bitrate against reconstruction quality under bandwidth and storage constraints.