reorthogonalization

Reorthogonalization is the process of restoring orthogonality among a set of vectors that has lost it due to numerical errors in finite-precision arithmetic. It is commonly used in numerical linear algebra algorithms that build orthogonal bases, such as Gram-Schmidt, QR factorization, and Krylov subspace methods (notably Lanczos and Arnoldi). In exact arithmetic the vectors remain orthogonal by construction; in floating-point arithmetic rounding errors cause inner products to deviate from zero, gradually eroding orthogonality and potentially affecting accuracy and stability.
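This loss of orthogonality is easy to observe. As an illustration (a small NumPy sketch, not part of the original article), classical Gram-Schmidt without reorthogonalization, applied to an ill-conditioned matrix such as a Hilbert matrix, produces a basis whose deviation from orthogonality is far above machine precision:

```python
import numpy as np

def classical_gram_schmidt(A):
    """Orthonormalize the columns of A with classical Gram-Schmidt,
    with no reorthogonalization."""
    m, n = A.shape
    Q = np.zeros((m, n))
    for j in range(n):
        v = A[:, j].copy()
        # Subtract the projections onto all previously computed basis vectors.
        v -= Q[:, :j] @ (Q[:, :j].T @ A[:, j])
        Q[:, j] = v / np.linalg.norm(v)
    return Q

# An ill-conditioned test matrix (the 10x10 Hilbert matrix) makes the
# effect visible; its condition number is on the order of 1e13.
n = 10
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
Q = classical_gram_schmidt(A)

# ||Q^T Q - I|| measures the loss of orthogonality. In exact arithmetic
# it would be zero; here rounding errors make it many orders of
# magnitude larger than machine epsilon (~1e-16).
loss = np.linalg.norm(Q.T @ Q - np.eye(n))
print(loss)
```

Each column is still individually normalized; it is the mutual orthogonality between columns that degrades.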

Reorthogonalization compensates for this by reapplying an orthogonalization step one or more times. Full reorthogonalization re-orthogonalizes each new vector against all previously computed basis vectors. Selective (or partial) reorthogonalization performs orthogonalization only against a subset of vectors, or only when certain bounds indicate a risk of loss of orthogonality. Variants of Gram-Schmidt, such as classical Gram-Schmidt with reorthogonalization or modified Gram-Schmidt with reorthogonalization, are commonly used. Householder-based approaches can also be adapted to reorthogonalize.

Effectiveness and cost must be balanced: reorthogonalization improves numerical stability and the accuracy of eigenvalue estimates in Krylov methods, but it incurs substantial computational and memory cost, since each reorthogonalization step requires many dot products with existing vectors. Therefore, selective reorthogonalization is often preferred in large-scale problems, applied when the loss of orthogonality would meaningfully degrade results. The choice of strategy depends on the algorithm, problem conditioning, and desired accuracy.

Related topics include the Gram-Schmidt process, QR decomposition, the Lanczos method, the Arnoldi method, and Krylov subspace techniques.
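As a concrete illustration of full reorthogonalization (a hedged NumPy sketch, assuming the well-known "twice is enough" heuristic for classical Gram-Schmidt), repeating the projection step once per vector typically restores orthogonality to near machine precision, at roughly double the cost in dot products:

```python
import numpy as np

def cgs_with_reorthogonalization(A):
    """Classical Gram-Schmidt where each projection step is applied
    twice ("twice is enough"): the second pass removes the components
    that rounding errors left behind in the first."""
    m, n = A.shape
    Q = np.zeros((m, n))
    for j in range(n):
        v = A[:, j].copy()
        for _ in range(2):  # orthogonalize, then reorthogonalize
            v -= Q[:, :j] @ (Q[:, :j].T @ v)
        Q[:, j] = v / np.linalg.norm(v)
    return Q

# The same ill-conditioned 10x10 Hilbert matrix that defeats plain
# classical Gram-Schmidt.
n = 10
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
Q = cgs_with_reorthogonalization(A)

# With the extra pass, ||Q^T Q - I|| is near machine precision.
print(np.linalg.norm(Q.T @ Q - np.eye(n)))
```

A selective variant would run the second pass only when a cheap test (for example, a large drop in the norm of `v` during the first pass) signals that cancellation has occurred, saving dot products on well-behaved columns.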