Lehmann-Scheffé theorem
The Lehmann-Scheffé theorem is a fundamental result in statistical inference that identifies the best possible unbiased estimator of a parameter function. It states that, for a family of probability distributions dominated by a measure, if an unbiased estimator of a parameter function is itself a function of a complete sufficient statistic, then it is the unique (up to almost-sure equality) minimum-variance unbiased estimator of that function. A statistic is complete if the only function of it whose expected value is zero under every parameter value is the function that equals zero almost surely; a statistic is sufficient if it captures all the information about the parameter of interest contained in the sample. The theorem is important because it gives a constructive method for finding minimum-variance unbiased estimators whenever a complete sufficient statistic exists: the search for an optimal estimator can be restricted to functions of that statistic. It is named after the statisticians Erich L. Lehmann and Henry Scheffé, who published the result jointly.
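As a sketch of the formal statement (the notation below is assumed here rather than taken from the source), let T be a complete sufficient statistic for the family {P_θ : θ ∈ Θ} and let δ(T) be an unbiased estimator of τ(θ). Then
$$
\mathbb{E}_\theta\!\left[\delta(T)\right] = \tau(\theta)\ \text{for all }\theta \in \Theta
\quad\Longrightarrow\quad
\delta(T)\ \text{is the minimum-variance unbiased estimator of } \tau(\theta),
$$
and this estimator is unique up to sets of probability zero. For instance, if X_1, ..., X_n are independent Poisson(λ) observations, the sum T = X_1 + ... + X_n is complete and sufficient, and the sample mean T/n is unbiased for λ; by the theorem, T/n is the minimum-variance unbiased estimator of λ.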