limitEntropy
LimitEntropy is a term used in information theory and probability to describe the asymptotic behavior of entropy in a sequence of random variables or probability distributions. It is most commonly defined as the entropy rate of a stochastic process: for a process X = {Xn} taking values in a finite alphabet, the limit entropy h is the limit as n grows of (1/n) times the joint entropy H(X1, ..., Xn), provided the limit exists.
Formally, h = lim_{n→∞} (1/n) H(X1, ..., Xn). For stationary processes, this limit exists (by subadditivity of the joint entropy and Fekete's subadditive lemma) and coincides with lim_{n→∞} H(Xn | Xn-1, ..., X1), the conditional entropy of the next symbol given the entire past.
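A small numerical sketch can make the definition concrete. The following Python snippet (a minimal sketch assuming NumPy; the two-state transition matrix is an arbitrary choice made only for illustration) computes the entropy rate of a stationary Markov chain and shows the block entropies (1/n) H(X1, ..., Xn) approaching it:

import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (0 log 0 := 0)."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def stationary_distribution(P):
    """Stationary distribution pi of a transition matrix P (pi P = pi)."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return pi / pi.sum()

# Two-state Markov chain used as a toy example (these values are assumptions).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
pi = stationary_distribution(P)

# Entropy rate of a stationary Markov chain:
# h = -sum_i pi_i sum_j P_ij log2 P_ij = H(X2 | X1).
h = sum(pi[i] * entropy(P[i]) for i in range(len(pi)))

# For the stationary chain, the chain rule gives H(X1, ..., Xn) = H(pi) + (n-1) h,
# so (1/n) H(X1, ..., Xn) -> h as n grows.
for n in (1, 10, 100, 1000):
    block = (entropy(pi) + (n - 1) * h) / n
    print(f"n={n:5d}  (1/n) H(X1..Xn) = {block:.4f}")
print(f"entropy rate h = {h:.4f} bits/symbol")

Because a first-order stationary Markov chain has the closed form h = -Σ_i π_i Σ_j P_ij log2 P_ij, the block entropies can be evaluated exactly without enumerating joint distributions.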
LimitEntropy can also refer to the limiting entropy of a sequence of probability distributions p_n on a common alphabet, defined as lim_{n→∞} H(p_n) whenever this limit exists. On a fixed finite alphabet, entropy is continuous, so the limit equals the entropy of the limiting distribution whenever p_n converges pointwise.
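A minimal sketch of this second sense (Python with NumPy; the alphabet size and the particular sequence p_n below are assumptions chosen only for the example) shows H(p_n) converging to the entropy of the limiting distribution, here a point mass with entropy 0:

import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (0 log 0 := 0)."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

k = 8                              # assumed alphabet size for the illustration
uniform = np.full(k, 1.0 / k)
point_mass = np.eye(k)[0]          # degenerate distribution at symbol 0

# p_n interpolates between the uniform distribution and a point mass;
# on a fixed finite alphabet entropy is continuous, so H(p_n) -> H(lim p_n) = 0.
for n in (1, 10, 100, 10000):
    p_n = (1.0 / n) * uniform + (1.0 - 1.0 / n) * point_mass
    print(f"n={n:6d}  H(p_n) = {entropy(p_n):.5f} bits")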
Applications of limitEntropy include characterizing the informational efficiency of data sources, informing compression schemes, and providing fundamental limits on lossless compression: by the source coding theorem (and the Shannon-McMillan-Breiman theorem for stationary ergodic processes), the entropy rate is the smallest average number of bits per symbol that any lossless code can achieve asymptotically.
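The link to compression can be illustrated roughly as follows (Python; the Bernoulli source parameters and the choice of zlib as a stand-in general-purpose compressor are assumptions made for the demonstration). The bits per symbol spent by the compressor should land somewhat above, but in the neighbourhood of, the source's entropy rate:

import zlib
import numpy as np

def binary_entropy(p):
    """Entropy in bits of a Bernoulli(p) source, for p strictly between 0 and 1."""
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

rng = np.random.default_rng(0)
p, n = 0.1, 1_000_000              # assumed source parameters for the demo

# Draw n i.i.d. Bernoulli(p) bits, pack them into bytes, and compress.
bits = (rng.random(n) < p).astype(np.uint8)
packed = np.packbits(bits).tobytes()
compressed = zlib.compress(packed, level=9)

# No lossless code can beat the entropy rate asymptotically; a general-purpose
# compressor typically lands somewhat above it.
rate = 8 * len(compressed) / n     # bits spent per source symbol
print(f"entropy rate      : {binary_entropy(p):.3f} bits/symbol")
print(f"zlib, bits/symbol : {rate:.3f}")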
See also: Shannon entropy, entropy rate, information rate, ergodic theory, coding theory.