Log-likelihood
Log-likelihood (also written loglikelihood) is the natural logarithm of the likelihood function, used in statistics to quantify how probable a set of observed data is under a statistical model at a given value of its parameters. For data x = (x1, ..., xn) and a model with parameter θ, the likelihood is L(θ) = p(x1, ..., xn | θ). If the observations are independent, L(θ) = ∏i p(xi | θ), and the log-likelihood is l(θ) = log L(θ) = ∑i log p(xi | θ). The log transformation turns products into sums, which improves numerical stability and simplifies differentiation.
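As an illustration of why the sum of log-densities is preferred numerically, the following sketch (assuming Python with NumPy/SciPy and synthetic normal data; the data and parameter values are hypothetical, not part of the definition above) compares the raw product of densities with the sum of their logarithms:

```python
import numpy as np
from scipy.stats import norm

# Synthetic data: 1,000 observations treated as i.i.d. draws from a normal model.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=1000)

# Evaluate both forms at the same (hypothetical) parameter values.
theta = {"loc": 2.0, "scale": 1.5}

# The raw product of densities underflows to 0.0 for even moderate n ...
likelihood = np.prod(norm.pdf(x, **theta))

# ... while the sum of log-densities stays comfortably within floating-point range.
log_likelihood = np.sum(norm.logpdf(x, **theta))

print(likelihood)       # 0.0 due to underflow
print(log_likelihood)   # a finite value on the order of -1.8e3
```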
Maximizing the log-likelihood with respect to θ yields the maximum likelihood estimates (MLEs) of the parameters. Because the logarithm is a strictly increasing function, the θ that maximizes l(θ) also maximizes L(θ), so working on the log scale does not change the estimate.
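A minimal sketch of maximum likelihood estimation in practice, assuming SciPy's general-purpose optimizer and a normal model with both mean and scale unknown (these modeling choices are illustrative, not prescribed above); the negative log-likelihood is minimized, which is equivalent to maximizing l(θ):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Synthetic data; in a real problem x is the observed sample.
rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=1.5, size=500)

def negative_log_likelihood(params, data):
    """-l(theta) for a normal model with unknown mean and scale."""
    mu, log_sigma = params                 # optimize log(sigma) so sigma stays positive
    sigma = np.exp(log_sigma)
    return -np.sum(norm.logpdf(data, loc=mu, scale=sigma))

# Minimizing -l(theta) is the same as maximizing l(theta).
result = minimize(negative_log_likelihood, x0=[0.0, 0.0], args=(x,))
mu_hat = result.x[0]
sigma_hat = np.exp(result.x[1])
print(mu_hat, sigma_hat)   # close to the sample mean and sample standard deviation
```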
A common example is data from a normal distribution with unknown mean μ and known variance σ^2. The log-likelihood is l(μ) = -(n/2) log(2πσ^2) - (1/(2σ^2)) ∑i (xi - μ)^2, and setting its derivative with respect to μ to zero gives the MLE μ̂ = (1/n) ∑i xi, the sample mean.
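The closed-form result can be checked numerically. The sketch below (synthetic data, with σ^2 assumed known and its value chosen only for illustration) evaluates the log-likelihood formula above on a grid of μ values and confirms that the maximizer agrees with the sample mean:

```python
import numpy as np

# Synthetic sample with known variance sigma^2 and unknown mean mu.
rng = np.random.default_rng(2)
sigma = 2.0
x = rng.normal(loc=5.0, scale=sigma, size=200)
n = len(x)

def log_likelihood(mu):
    """l(mu) = -(n/2) log(2*pi*sigma^2) - (1/(2*sigma^2)) * sum((x_i - mu)^2)."""
    return -0.5 * n * np.log(2 * np.pi * sigma**2) - np.sum((x - mu) ** 2) / (2 * sigma**2)

# A coarse grid search over mu agrees with the closed-form MLE, the sample mean.
grid = np.linspace(x.min(), x.max(), 10_001)
mu_grid = grid[np.argmax([log_likelihood(m) for m in grid])]
print(x.mean(), mu_grid)   # equal up to the grid resolution
```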
Log-likelihood is central to model comparison and hypothesis testing. The likelihood ratio test uses the statistic Λ = 2 [l(θ̂) - l(θ̂0)], where θ̂ is the unrestricted MLE and θ̂0 is the MLE under the null hypothesis; under standard regularity conditions, Λ is asymptotically chi-squared distributed with degrees of freedom equal to the number of parameters restricted by the null (Wilks' theorem). Information criteria such as AIC and BIC are likewise computed from the maximized log-likelihood.
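A small worked example of the likelihood ratio test, assuming a normal model with known scale and a null hypothesis that fixes the mean at zero (both choices are hypothetical); the chi-squared reference distribution has one degree of freedom because exactly one parameter is restricted:

```python
import numpy as np
from scipy.stats import norm, chi2

# Synthetic data; H0 fixes mu = 0, the alternative leaves mu free (scale known, = 1).
rng = np.random.default_rng(3)
x = rng.normal(loc=0.3, scale=1.0, size=100)

# Maximized log-likelihood under the null and under the alternative.
ll_null = np.sum(norm.logpdf(x, loc=0.0, scale=1.0))
ll_alt = np.sum(norm.logpdf(x, loc=x.mean(), scale=1.0))   # MLE of mu is the sample mean

# Likelihood ratio statistic; by Wilks' theorem it is asymptotically chi-squared
# with 1 degree of freedom (one parameter restricted by the null).
lr_statistic = 2.0 * (ll_alt - ll_null)
p_value = chi2.sf(lr_statistic, df=1)
print(lr_statistic, p_value)
```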
Notes: the log-likelihood depends on the assumed model; misspecification can bias results. For complex or non-iid data, the joint density does not factor into a simple product over observations, so the log-likelihood must be built from the appropriate joint or conditional densities and is typically evaluated and maximized numerically.