DemingRegression

Deming regression is an errors-in-variables method for estimating the relationship between two variables when both are subject to measurement error. Named after W. Edwards Deming, it is also known as Deming least squares or orthogonal regression in some contexts. The approach is widely used in method comparison studies, calibration tasks, and analytical chemistry, where neither variable can be regarded as error-free.

The standard model assumes observed pairs (x_i, y_i) arise from true values linked by y = β0 + β1 x, with independent measurement errors ε_xi and ε_yi having variances σ_x^2 and σ_y^2, respectively. A key quantity is the error-variance ratio λ = σ_y^2 / σ_x^2.

Let x̄ and ȳ be the sample means, and S_xx = Σ(x_i − x̄)^2, S_yy = Σ(y_i − ȳ)^2, S_xy = Σ(x_i − x̄)(y_i − ȳ). The slope estimator in Deming regression is

β1 = [S_yy − λ S_xx + sqrt((S_yy − λ S_xx)^2 + 4 λ S_xy^2)] / (2 S_xy),

and the intercept is β0 = ȳ − β1 x̄. When λ = 1, the method reduces to orthogonal (total least squares) regression, which minimizes the sum of squared perpendicular distances to the data points; for general λ, the distances are weighted by the error variances, rather than being the purely vertical distances used in ordinary least squares.

Assumptions include linearity, normal and independent measurement errors with known (or estimable) λ, and homoscedastic error variances.
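As a concrete illustration, the closed-form estimator above translates directly into a few lines of NumPy. This is a minimal sketch rather than a production implementation: the function name deming_fit and the synthetic-data example are invented here for illustration, and the code assumes S_xy ≠ 0.

```python
import numpy as np

def deming_fit(x, y, lam=1.0):
    """Deming regression slope and intercept.

    lam is the error-variance ratio sigma_y^2 / sigma_x^2;
    lam = 1 gives orthogonal (total least squares) regression.
    Assumes S_xy != 0.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xbar, ybar = x.mean(), y.mean()
    s_xx = np.sum((x - xbar) ** 2)
    s_yy = np.sum((y - ybar) ** 2)
    s_xy = np.sum((x - xbar) * (y - ybar))
    # Closed-form slope from the formula above.
    slope = (s_yy - lam * s_xx
             + np.sqrt((s_yy - lam * s_xx) ** 2 + 4.0 * lam * s_xy ** 2)) / (2.0 * s_xy)
    intercept = ybar - slope * xbar
    return slope, intercept

# Synthetic example: both variables measured with error.
rng = np.random.default_rng(0)
truth = rng.uniform(0.0, 10.0, size=50)
x = truth + rng.normal(scale=0.5, size=50)              # measurement error in x
y = 2.0 + 1.5 * truth + rng.normal(scale=0.5, size=50)  # measurement error in y
slope, intercept = deming_fit(x, y, lam=1.0)             # equal error variances assumed
print(slope, intercept)
```

With λ = 1 the same slope can also be recovered from a total least squares fit of the centered data, which is a useful sanity check on the implementation.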
Limitations include sensitivity to misspecification of λ, which biases the slope estimate, and to outliers, since the fit is based on squared distances.
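One quick way to gauge this sensitivity is to refit the same data under several assumed error-variance ratios and compare the resulting slopes. A rough check, reusing the hypothetical deming_fit and the synthetic x, y from the sketch above:

```python
# Refit under deliberately different error-variance ratios to see how far
# the slope drifts; reuses deming_fit, x, y from the previous sketch.
for lam in (0.25, 1.0, 4.0):
    slope, intercept = deming_fit(x, y, lam=lam)
    print(f"lambda = {lam:.2f}: slope = {slope:.3f}, intercept = {intercept:.3f}")
```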