Eigenvalues

An eigenvalue of a square matrix A is a scalar λ for which there exists a nonzero vector v satisfying Av = λv, where v is called an eigenvector associated with λ. Equivalently, λ is an eigenvalue if the linear transformation A has a nontrivial invariant direction, i.e., a nonzero vector that is mapped to a scalar multiple of itself.
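The defining relation Av = λv can be checked directly on a small example. The sketch below uses plain Python lists as a 2×2 matrix and verifies that v = [1, 1] is an eigenvector of A with eigenvalue 3 (the matrix and numbers here are illustrative, not from the text above):

```python
# Illustrative 2x2 example: A has eigenvalue lam = 3 with eigenvector v = [1, 1].
A = [[2, 1], [1, 2]]
v = [1, 1]

# Matrix-vector product Av, computed row by row.
Av = [sum(a * x for a, x in zip(row, v)) for row in A]

lam = 3
assert Av == [lam * x for x in v]  # Av = lam * v holds, so v is an eigenvector
```

Any nonzero scalar multiple of v satisfies the same relation, which is why eigenvectors are determined only up to scale.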

To find eigenvalues, solve det(A − λI) = 0; the polynomial p_A(λ) = det(A − λI) is the characteristic polynomial, and its roots are the eigenvalues (possibly complex).

Eigenvectors are obtained by solving (A − λI)v = 0 for each eigenvalue λ. The algebraic multiplicity of an eigenvalue is its multiplicity as a root of the characteristic polynomial; the geometric multiplicity is the dimension of its eigenspace. In general, the geometric multiplicity is at most the algebraic multiplicity; a matrix is diagonalizable if and only if the sum of the geometric multiplicities equals the dimension of the matrix.

Two basic identities relate eigenvalues to matrix structure: trace(A) equals the sum of the eigenvalues (counting multiplicities), and det(A) equals the product of the eigenvalues. Real matrices can have complex eigenvalues, which occur in conjugate pairs.

Special cases: real symmetric matrices have real eigenvalues and orthogonal eigenvectors; more generally, Hermitian matrices have a full set of orthonormal eigenvectors (the spectral theorem).

Computation: exact solutions via the characteristic polynomial are feasible for small matrices; for larger or ill-conditioned problems, numerical methods such as the QR algorithm, power iteration, and inverse iteration are used. Eigenvalues can be sensitive to perturbations.

Applications span solving systems of differential equations, stability analysis, vibrations, quantum mechanics, PageRank, and dimensionality reduction methods such as principal component analysis, among others.
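Power iteration, one of the numerical methods mentioned above, is simple enough to sketch in a few lines. The version below is a minimal pure-Python illustration (the matrix, starting vector, and iteration count are assumptions for the example, not a production implementation): it repeatedly applies A to a vector and normalizes, so the vector aligns with the dominant eigenvector, and the Rayleigh quotient then estimates the dominant eigenvalue.

```python
def mat_vec(A, v):
    """Matrix-vector product for a matrix given as a list of rows."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def normalize(v):
    """Scale v to unit Euclidean length."""
    norm = sum(x * x for x in v) ** 0.5
    return [x / norm for x in v]

def power_iteration(A, iters=100):
    """Estimate the dominant eigenvalue and eigenvector of A by power iteration."""
    # Start from a fixed nonzero vector; repeated application of A
    # rotates it toward the eigenvector of largest |eigenvalue|.
    v = normalize([1.0] + [0.0] * (len(A) - 1))
    for _ in range(iters):
        v = normalize(mat_vec(A, v))
    # Rayleigh quotient v^T A v (v has unit length) estimates the eigenvalue.
    lam = sum(x * y for x, y in zip(mat_vec(A, v), v))
    return lam, v

# Example: the eigenvalues of [[2, 1], [1, 2]] are 3 and 1,
# so the iteration converges to lam ~ 3.
A = [[2.0, 1.0], [1.0, 2.0]]
lam, v = power_iteration(A)
```

Convergence is geometric with rate |λ₂/λ₁|, which is why the method can be slow when the two largest eigenvalues are close in magnitude; inverse iteration with a shift is the usual remedy.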