LSq
LSq, short for least squares, is a broad class of optimization procedures used to estimate model parameters by minimizing the sum of squared differences between observed values and those predicted by a model. In its linear form, the model is y = Xβ + ε, where y is an n-vector of observations, X is an n×p design matrix, β is a p-vector of parameters, and ε is an error vector with mean zero. The objective S(β) = ||y − Xβ||^2 is minimized. If X has full column rank, the solution is β̂ = (X^T X)^{-1} X^T y. Ordinary least squares assumes uncorrelated errors with constant variance; when these assumptions fail, generalized least squares replaces the objective with (y − Xβ)^T Ω^{-1} (y − Xβ), where Ω is the covariance matrix of the errors, and weighted least squares is the special case in which Ω is diagonal.
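A minimal sketch of the linear case in NumPy, using illustrative synthetic data and variable names not taken from the text: it computes β̂ both from the normal equations above and with np.linalg.lstsq, which is the numerically preferable route.

```python
# Ordinary least squares sketch (illustrative data and names).
import numpy as np

rng = np.random.default_rng(0)

n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])  # design matrix with intercept
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.1, size=n)                # noisy observations

# Closed-form normal-equations solution (requires X to have full column rank)
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# SVD-based solver; better conditioned than forming X^T X explicitly
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print(beta_hat)
print(beta_lstsq)
```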
Nonlinear LS extends the idea to models y ≈ f(X, β); because no closed-form solution exists in general, the parameter vector is found by iteratively linearizing f around the current estimate and solving the resulting linear least-squares problem, as in the Gauss–Newton and Levenberg–Marquardt algorithms (see the sketch below).
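As a hedged illustration of this iteration, the following Gauss–Newton sketch fits an assumed exponential model f(t, β) = β0·exp(−β1·t); the model, data, and starting values are made up for the example and are not from the text.

```python
# Gauss–Newton sketch for a nonlinear least-squares fit (illustrative model and data).
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 4.0, 50)
beta_true = np.array([3.0, 1.3])
y = beta_true[0] * np.exp(-beta_true[1] * t) + rng.normal(scale=0.05, size=t.size)

def model(beta):
    return beta[0] * np.exp(-beta[1] * t)

def jacobian(beta):
    e = np.exp(-beta[1] * t)
    # Columns: partial derivatives of the model with respect to beta0 and beta1
    return np.column_stack([e, -beta[0] * t * e])

beta = np.array([1.0, 1.0])                       # starting guess
for _ in range(20):
    r = y - model(beta)                           # current residuals
    J = jacobian(beta)
    step = np.linalg.solve(J.T @ J, J.T @ r)      # linearized least-squares step
    beta = beta + step
    if np.linalg.norm(step) < 1e-10:
        break

print(beta)  # should be close to beta_true
```

Levenberg–Marquardt modifies the same step by damping J^T J, which makes the iteration more robust far from the solution.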
History and usage: The method was first published by Adrien-Marie Legendre in 1805, while Carl Friedrich Gauss claimed to have used it as early as 1795 and published his own account in 1809, linking it to the theory of errors and the normal distribution. It has since become a standard tool in regression analysis, curve fitting, and signal processing.
LSq is commonly denoted LSQ or LS, and is related to total least squares, generalized least squares, and weighted least squares.