2biBitCount
2biBitCount is a term used in some discussions of data encoding to describe the average number of bits required to represent two-bit blocks after applying a coding scheme. It is not a standard metric with broad consensus, and its exact interpretation can vary between texts. In general, it is described as the expected code length per two-bit block and is used to compare the efficiency of different block-encoding strategies for data that naturally arrives as two-bit symbols.
Definition and calculation: Let blocks B take values among {00, 01, 10, 11} with probabilities p00, p01, p10 and p11, and let L(b) be the length in bits of the codeword that the coding scheme assigns to block b. The 2biBitCount is then the expected codeword length per block, p00·L(00) + p01·L(01) + p10·L(10) + p11·L(11). A fixed-length encoding gives exactly 2 bits per block, while a variable-length code can fall below 2 when the block distribution is skewed.
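A minimal sketch of this calculation in Python, assuming an illustrative block distribution and an illustrative prefix code (neither is prescribed by the term itself):

# Illustrative sketch: computing a 2biBitCount-style figure for a two-bit
# block code. The probabilities and codeword lengths below are assumptions
# chosen for demonstration, not values taken from any standard.

def two_bit_block_count(probs, code_lengths):
    """Expected codeword length (in bits) per two-bit block.

    probs: dict mapping block string -> probability (must sum to 1)
    code_lengths: dict mapping block string -> codeword length in bits
    """
    assert abs(sum(probs.values()) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(probs[b] * code_lengths[b] for b in probs)

if __name__ == "__main__":
    probs = {"00": 0.5, "01": 0.25, "10": 0.125, "11": 0.125}
    lengths = {"00": 1, "01": 2, "10": 3, "11": 3}   # prefix code: 0, 10, 110, 111
    print(two_bit_block_count(probs, lengths))       # 1.75 bits per block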
Relation and usage: 2biBitCount is used mainly in theoretical analyses of block coding and two-bit alphabets, where it is compared against two reference points: the fixed-length baseline of 2 bits per block, and the per-block entropy H(B), which by the source coding theorem no uniquely decodable code can beat. A coding scheme is considered efficient for a given block distribution when its 2biBitCount sits close to H(B).
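A short sketch of that comparison, again with an assumed distribution and code; the entropy function is the standard Shannon entropy over the four blocks:

# Sketch comparing a code's 2biBitCount against the per-block entropy bound.
# The block distribution and codeword lengths below are assumptions for
# illustration only.
import math

def block_entropy(probs):
    """Shannon entropy (bits) of a two-bit block distribution."""
    return -sum(p * math.log2(p) for p in probs.values() if p > 0)

probs = {"00": 0.5, "01": 0.25, "10": 0.125, "11": 0.125}
lengths = {"00": 1, "01": 2, "10": 3, "11": 3}
expected = sum(probs[b] * lengths[b] for b in probs)

# This dyadic distribution lets the prefix code meet the entropy bound exactly.
print(f"entropy lower bound:   {block_entropy(probs):.3f} bits/block")  # 1.750
print(f"2biBitCount of code:   {expected:.3f} bits/block")              # 1.750
print("fixed-length baseline: 2.000 bits/block")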
Example: Suppose a source emits 00 with probability 0.5, and 01, 10, 11 with probabilities 0.1667 (one sixth) each. A Huffman code over the four blocks assigns a 1-bit codeword to 00, a 2-bit codeword to one of the remaining blocks, and 3-bit codewords to the other two, so the 2biBitCount is 0.5·1 + (1/6)·2 + (1/6)·3 + (1/6)·3 ≈ 1.83 bits per block. This compares with 2 bits per block for a fixed-length encoding and a per-block entropy of about 1.79 bits.
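The same example worked in code, using a standard heap-based Huffman construction (the implementation details are illustrative, not part of the term's definition):

# Worked sketch of the example above: build a Huffman code over the four
# two-bit blocks with the stated probabilities and compute its 2biBitCount.
import heapq

def huffman_code_lengths(probs):
    """Return a dict block -> codeword length for an optimal prefix code."""
    heap = [(p, i, [b]) for i, (b, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    lengths = {b: 0 for b in probs}
    counter = len(heap)
    while len(heap) > 1:
        p1, _, blocks1 = heapq.heappop(heap)
        p2, _, blocks2 = heapq.heappop(heap)
        for b in blocks1 + blocks2:
            lengths[b] += 1          # every merge adds one bit to each member
        heapq.heappush(heap, (p1 + p2, counter, blocks1 + blocks2))
        counter += 1
    return lengths

probs = {"00": 0.5, "01": 1 / 6, "10": 1 / 6, "11": 1 / 6}
lengths = huffman_code_lengths(probs)
count = sum(probs[b] * lengths[b] for b in probs)
print(lengths)          # one block gets 1 bit, one gets 2, two get 3 (ties vary)
print(round(count, 3))  # 1.833 bits per block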