Least squares
The least squares method is a standard approach for estimating unknown parameters by minimizing the sum of squared deviations between observed values and those predicted by a model. It is widely used in data fitting, regression analysis, and numerical approximation. Historical note: the method is associated with Adrien-Marie Legendre, who published it in 1805, and with Carl Friedrich Gauss; it was used early on to fit polynomial models to data.
Suppose y ∈ R^n, X ∈ R^{n×p}, and the model is y ≈ Xβ with β ∈ R^p. The ordinary least squares (OLS) estimate minimizes the residual sum of squares S(β) = ||y − Xβ||² = Σ_{i=1}^{n} (y_i − x_i^T β)², where x_i^T is the i-th row of X. When X has full column rank, the minimizer is the unique solution of the normal equations X^T X β̂ = X^T y, giving β̂ = (X^T X)^{-1} X^T y.
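As a concrete sketch, the normal equations for OLS can be solved directly with NumPy; the small design matrix and response below are made up purely for illustration:

```python
import numpy as np

# Hypothetical data: n = 5 observations, p = 2 parameters (intercept, slope).
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
y = np.array([0.1, 0.9, 2.1, 2.9, 4.1])

# Normal equations X^T X beta = X^T y (valid when X has full column rank).
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# np.linalg.lstsq solves the same problem via a numerically
# more stable factorization and is preferred in practice.
beta_ref, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Both routes agree here; for ill-conditioned X, forming X^T X squares the condition number, which is why library routines avoid the explicit normal equations.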
Variants include weighted and generalized formulations. In weighted least squares, with a positive definite weight matrix W, the objective becomes (y − Xβ)^T W (y − Xβ), giving β̂ = (X^T W X)^{-1} X^T W y; generalized least squares takes W to be the inverse of the error covariance matrix.
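A minimal sketch of the weighted formulation, again with fabricated data; the diagonal weight matrix and the row-rescaling equivalence shown here are standard, but the specific numbers are illustrative:

```python
import numpy as np

X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([0.0, 1.2, 1.8, 3.0])

# Hypothetical weights: larger weight = more trusted observation.
w = np.array([1.0, 4.0, 4.0, 1.0])
W = np.diag(w)

# WLS solution: beta = (X^T W X)^{-1} X^T W y
beta_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# Equivalent: rescale each row by sqrt(w_i) and run ordinary least squares.
sw = np.sqrt(w)
beta_eq, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
```

The rescaling trick reduces WLS to OLS, so any stable OLS solver can handle the weighted case without forming W explicitly.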
Nonlinear least squares extends the idea to models whose predictions are nonlinear in the parameters. One typically minimizes Σ_i (y_i − f(x_i; β))² iteratively, for example with the Gauss–Newton or Levenberg–Marquardt algorithms, since a closed-form solution generally does not exist.
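The iterative approach can be sketched with a bare-bones Gauss–Newton loop; the exponential model f(t; a, b) = a·exp(b·t), the noise-free synthetic data, and the starting guess are all assumptions made for illustration:

```python
import numpy as np

# Hypothetical model f(t; a, b) = a * exp(b * t), nonlinear in b.
t = np.linspace(0.0, 2.0, 20)
y = 2.0 * np.exp(0.5 * t)          # synthetic, noise-free observations

theta = np.array([1.8, 0.4])       # starting guess near the solution
for _ in range(30):
    a, b = theta
    f = a * np.exp(b * t)          # model predictions
    # Jacobian of f with respect to (a, b), one column per parameter.
    J = np.column_stack([np.exp(b * t), a * t * np.exp(b * t)])
    # Gauss-Newton step: solve the linearized least squares problem.
    delta = np.linalg.solve(J.T @ J, J.T @ (y - f))
    theta = theta + delta
```

This plain iteration works when the starting point is good; Levenberg–Marquardt adds a damping term to J^T J to remain stable far from the solution, which is what production solvers generally implement.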
Properties and scope: ordinary least squares rests on assumptions about the error structure, chiefly that the errors are uncorrelated and homoscedastic (the Gauss–Markov conditions, under which OLS is the best linear unbiased estimator); normality of the errors is needed only for exact confidence intervals and hypothesis tests.