Cramér-Rao lower bound
The Cramér-Rao lower bound, often abbreviated as CRLB, is a fundamental result in statistical inference that provides a theoretical limit on the variance of any unbiased estimator of a parameter. Developed independently by C. R. Rao in 1945 and Harald Cramér in 1946, it establishes a minimum variance below which no unbiased estimator of a parameter of a probability distribution can fall.
The CRLB is derived from the Fisher information, which quantifies the amount of information a random variable carries about an unknown parameter of the distribution from which it is drawn.
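As a sketch of the standard scalar-parameter statement (the notation below is chosen for illustration and is not taken from the text above), the Fisher information of an observation X with density f(x; θ), and the resulting bound on an unbiased estimator θ̂ of θ, can be written in LaTeX as:

I(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial \theta} \log f(X;\theta)\right)^{2}\right],
\qquad
\operatorname{Var}\bigl(\hat{\theta}\bigr) \;\ge\; \frac{1}{I(\theta)}.

For n independent and identically distributed observations, the information adds, so the bound becomes 1 / (n I(θ)) with I(θ) the per-observation Fisher information.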
If an unbiased estimator achieves this lower bound, it is called an efficient estimator. Efficient estimators therefore have the smallest variance attainable by any unbiased estimator of the parameter.
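As an illustrative sketch (the choice of distribution, variable names, and numbers are assumptions, not taken from the text), the following Python snippet checks numerically that the sample mean of a normal distribution with known variance is efficient: its Monte Carlo variance matches the CRLB of sigma^2 / n.

import numpy as np

# Assumed example: estimating the mean mu of N(mu, sigma^2) with sigma known.
# The Fisher information of one observation about mu is 1/sigma^2, so the
# CRLB for n i.i.d. observations is sigma^2 / n.
rng = np.random.default_rng(0)
mu, sigma, n, trials = 2.0, 3.0, 50, 200_000

crlb = sigma**2 / n

# Monte Carlo estimate of the variance of the sample-mean estimator.
samples = rng.normal(mu, sigma, size=(trials, n))
estimates = samples.mean(axis=1)
empirical_var = estimates.var()

print(f"CRLB:                    {crlb:.5f}")
print(f"Variance of sample mean: {empirical_var:.5f}")

The two printed values should agree closely, which is what efficiency means here: the sample mean attains the Cramér-Rao lower bound rather than merely approaching it.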