Fisher information
Fisher information, named after Ronald A. Fisher, is a measure of the amount of information that an observable random variable X carries about an unknown parameter θ of a statistical model.
For a parametric family with density f(x; θ), the score is U(θ) = ∂/∂θ log f(X; θ). The Fisher information is defined as I(θ) = E[U(θ)^2], the variance of the score (since E[U(θ)] = 0 under regularity conditions); equivalently, I(θ) = −E[∂^2/∂θ^2 log f(X; θ)].
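As an illustration, the following sketch (a hypothetical example, not part of the original article) checks both forms of the definition for a Bernoulli(p) model by averaging over the two outcomes; the finite-difference step sizes are arbitrary choices.

```python
# Verify E[U(θ)^2] = -E[∂²/∂θ² log f(X;θ)] for Bernoulli(p)
# by summing exactly over the two outcomes x ∈ {0, 1}.
import math

def log_f(x, p):
    return x * math.log(p) + (1 - x) * math.log(1 - p)

def d_log_f(x, p, h=1e-6):
    # central-difference approximation of the score ∂/∂p log f(x; p)
    return (log_f(x, p + h) - log_f(x, p - h)) / (2 * h)

def d2_log_f(x, p, h=1e-4):
    # central-difference approximation of ∂²/∂p² log f(x; p)
    return (log_f(x, p + h) - 2 * log_f(x, p) + log_f(x, p - h)) / h**2

p = 0.3
pmf = {0: 1 - p, 1: p}
info_sq = sum(pmf[x] * d_log_f(x, p) ** 2 for x in (0, 1))
info_curv = -sum(pmf[x] * d2_log_f(x, p) for x in (0, 1))
print(info_sq, info_curv, 1 / (p * (1 - p)))  # all ≈ 4.7619
```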
For a vector parameter θ ∈ R^k, the information is the k×k Fisher information matrix I(θ) with entries [I(θ)]_ij = E[(∂/∂θ_i log f(X; θ)) (∂/∂θ_j log f(X; θ))].
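For the matrix case, a small symbolic sketch with sympy can compute the entries directly from this definition. The N(μ, σ) model and the variable names below are illustrative assumptions, and the Gaussian integrals may take sympy a moment to evaluate.

```python
# Fisher information matrix of a normal model, parameterized by
# (mu, sigma), computed symbolically from [I]_ij = E[(∂_i log f)(∂_j log f)].
import sympy as sp

x, mu = sp.symbols('x mu', real=True)
sigma = sp.symbols('sigma', positive=True)

# log-density of N(mu, sigma^2)
log_f = -sp.log(sigma) - sp.log(2 * sp.pi) / 2 - (x - mu)**2 / (2 * sigma**2)
f = sp.exp(log_f)
params = (mu, sigma)

def entry(i, j):
    # expectation over x of the product of partial derivatives
    integrand = sp.diff(log_f, params[i]) * sp.diff(log_f, params[j]) * f
    return sp.simplify(sp.integrate(integrand, (x, -sp.oo, sp.oo)))

info = sp.Matrix(2, 2, entry)
print(info)  # expected: Matrix([[1/sigma**2, 0], [0, 2/sigma**2]])
```

Note that the result depends on the parameterization: with σ as the parameter the lower-right entry is 2/σ^2, whereas parameterizing by σ^2 gives 1/(2σ^4).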
For independent and identically distributed samples X1,...,Xn, the total information is I_n(θ) = n I_1(θ). This additivity follows because the joint log-likelihood of independent observations is the sum of the individual log-likelihoods, as the derivation below makes explicit.
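In symbols, the additivity is a one-line derivation (standard regularity conditions assumed):

```latex
% The joint log-likelihood is a sum, so the expected curvature adds.
\ell_n(\theta) = \sum_{i=1}^{n} \log f(X_i;\theta)
\;\Longrightarrow\;
I_n(\theta)
  = -\,\mathbb{E}\!\left[\partial_\theta^2\, \ell_n(\theta)\right]
  = \sum_{i=1}^{n} \Bigl(-\,\mathbb{E}\!\left[\partial_\theta^2 \log f(X_i;\theta)\right]\Bigr)
  = n\, I_1(\theta).
```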
Common examples include Bernoulli(p) with I(p) = n/[p(1−p)] for n observations, and a normal model with known variance σ^2 where the information about the mean is I(μ) = n/σ^2.
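The Bernoulli formula can be checked by simulation. The sketch below (an assumed setup with arbitrary p, n, and seed, not from the original article) estimates I_n(p) as the variance of the total score and compares it with the closed form.

```python
# Monte Carlo estimate of the Fisher information of n i.i.d.
# Bernoulli(p) draws, via the variance of the total score.
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 0.3, 50, 200_000

x = rng.binomial(1, p, size=(reps, n))
# score of the joint sample: sum over observations of x/p - (1-x)/(1-p)
score = (x / p - (1 - x) / (1 - p)).sum(axis=1)
print(score.var())        # Monte Carlo estimate
print(n / (p * (1 - p)))  # closed form: ≈ 238.1
```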
A central result is the Cramér–Rao bound: Var(θ̂) ≥ I(θ)^{-1} for unbiased estimators. In the multivariate case, the bound states that Cov(θ̂) − I(θ)^{-1} is positive semidefinite.
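For the Bernoulli example above, the sample mean is unbiased for p and attains the bound exactly. The following sketch (assumed parameters, for illustration only) compares its simulated variance with p(1−p)/n.

```python
# The sample mean of Bernoulli(p) data has variance p(1-p)/n,
# which equals the Cramér–Rao bound 1 / I_n(p).
import numpy as np

rng = np.random.default_rng(1)
p, n, reps = 0.3, 50, 200_000

p_hat = rng.binomial(1, p, size=(reps, n)).mean(axis=1)
print(p_hat.var())       # simulated variance of the estimator
print(p * (1 - p) / n)   # Cramér–Rao bound: 0.0042
```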
Historically, Fisher introduced the concept in the 1920s as part of estimation theory. It remains a foundational quantity in statistical inference, underpinning maximum likelihood theory and the asymptotic normality of estimators.
See also: likelihood, score, Cramér–Rao bound, asymptotic normality.