Information density
Information density is a measure used in information theory and related disciplines to quantify how much information a unit of data carries. It can be expressed per symbol, per character, per unit of time, or per unit of bandwidth or storage space.
In formal terms, the self-information of a specific outcome x with probability p(x) is I(x) = -log2 p(x), measured in bits. Averaging self-information over all outcomes gives the Shannon entropy, which is the expected information per symbol and thus a natural measure of information density.
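The following is a minimal sketch of these definitions in Python: it computes the self-information of individual symbols and the average bits per symbol of a string, estimating p(x) from the string's own empirical symbol frequencies. The function names and the sample string are illustrative, not part of any standard library.

    from collections import Counter
    from math import log2

    def self_information(p: float) -> float:
        """I(x) = -log2 p(x), in bits, for an outcome with probability p."""
        return -log2(p)

    def bits_per_symbol(text: str) -> float:
        """Average information per symbol under the empirical symbol distribution."""
        counts = Counter(text)
        total = len(text)
        return sum((c / total) * self_information(c / total) for c in counts.values())

    if __name__ == "__main__":
        sample = "abracadabra"
        print(f"{bits_per_symbol(sample):.3f} bits per symbol")

A highly repetitive string yields a low value, while a string whose symbols are nearly equiprobable approaches the maximum of log2 of the alphabet size.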
In natural language, information density captures how much information each word or syllable conveys. It is typically quantified in bits per word or bits per syllable, estimated from a probabilistic model of the language.
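As a rough illustration, the sketch below estimates per-word surprisal with a unigram (frequency) model fit on the text itself; actual studies of linguistic information density use context-aware language models, and the toy sentence here is only a stand-in corpus.

    from collections import Counter
    from math import log2

    def word_surprisal(corpus_words: list[str]) -> dict[str, float]:
        """Bits conveyed by each word under a unigram model of the corpus."""
        counts = Counter(corpus_words)
        total = len(corpus_words)
        return {w: -log2(c / total) for w, c in counts.items()}

    words = "the cat sat on the mat because the mat was warm".split()
    surprisal = word_surprisal(words)
    avg = sum(surprisal[w] for w in words) / len(words)
    print(f"average {avg:.2f} bits per word")
    for w in sorted(set(words), key=surprisal.get, reverse=True):
        print(f"{w:>8}: {surprisal[w]:.2f} bits")

Frequent words such as "the" carry few bits per occurrence, while rare words carry more, which is the intuition behind per-word information density.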
Applications include data compression, channel coding, text analytics, and natural language processing. The concept also underpins related measures such as entropy rate and perplexity, which describe the average information per symbol produced by a source.
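In the data compression setting, the compressed size of a text gives a rough empirical upper bound on its information content. The sketch below uses zlib, a general-purpose codec, purely as an illustration; the two sample strings are contrived to contrast low and high information density.

    import zlib

    def compressed_bits_per_char(text: str) -> float:
        """Bits per character after zlib compression, an upper bound on information density."""
        raw = text.encode("utf-8")
        compressed = zlib.compress(raw, level=9)
        return 8 * len(compressed) / len(text)

    redundant = "ab" * 500  # highly repetitive, low information density
    varied = "".join(chr(33 + (i * 7919) % 90) for i in range(1000))  # less predictable

    print(f"repetitive text: {compressed_bits_per_char(redundant):.2f} bits/char")
    print(f"varied text:     {compressed_bits_per_char(varied):.2f} bits/char")

The repetitive string compresses to a small fraction of a bit per character, while the less predictable string stays closer to the uncompressed 8 bits per character, mirroring their relative information densities.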