
Eigenvalues (eigenarvot)

Eigenarvot, or eigenvalues, are scalars associated with a square matrix that describe how certain vectors are scaled under the corresponding linear transformation. If A is an n × n matrix and x is a nonzero vector such that Ax = λx, then λ is an eigenvalue of A and x is a corresponding eigenvector. Rearranging Ax = λx gives (A − λI)x = 0, which has a nonzero solution exactly when det(A − λI) = 0; this is the characteristic equation. The eigenvectors for a given eigenvalue, together with the zero vector, form its eigenspace. Each eigenvalue has an algebraic multiplicity (its multiplicity as a root of the characteristic polynomial) and a geometric multiplicity (the dimension of its eigenspace).
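For a 2 × 2 matrix the characteristic equation is a quadratic in λ, so the eigenvalues can be computed in closed form from the trace and determinant. A minimal sketch in Python (the language and the helper name eig2x2 are assumptions for illustration; the article names no tools):

```python
import math

def eig2x2(a, b, c, d):
    """Real eigenvalues of [[a, b], [c, d]] via the characteristic equation
    det(A - lambda*I) = lambda^2 - (a + d)*lambda + (a*d - b*c) = 0."""
    tr = a + d                 # trace
    det = a * d - b * c        # determinant
    disc = tr * tr - 4 * det   # discriminant; negative means a complex pair
    if disc < 0:
        raise ValueError("complex eigenvalues; not handled in this sketch")
    root = math.sqrt(disc)
    return ((tr + root) / 2, (tr - root) / 2)

# Upper-triangular matrix [[3, 1], [0, 2]]: the eigenvalues are the
# diagonal entries, 3 and 2.
print(eig2x2(3, 1, 0, 2))  # (3.0, 2.0)
```

For triangular matrices the eigenvalues can be read off the diagonal directly, which makes them convenient checks for a solver like this.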

Key properties include that an n × n matrix is diagonalizable if and only if it has n linearly independent eigenvectors. Real symmetric (or Hermitian) matrices have real eigenvalues and can be orthogonally diagonalized, with eigenvectors that are mutually orthogonal. The eigenvalues of A^T are the same as those of A, and eigenvectors corresponding to distinct eigenvalues are linearly independent.

Computation typically starts with solving the characteristic polynomial det(A − λI) = 0 to find the eigenvalues, followed by solving (A − λI)x = 0 for each eigenvalue to obtain the eigenvectors. Numerically, methods such as the QR algorithm, power iteration (which finds the dominant eigenvalue), and inverse iteration are used for larger matrices.

Example: For A = [[3, 1], [0, 2]], the eigenvalues are 3 and 2. An eigenvector for λ = 3 is (1, 0), and for λ = 2 is (1, −1).

Applications span linear differential equations, computer graphics, and data analysis techniques such as principal component analysis, where eigenvalues indicate the variance explained by principal components.

See also: eigenvectors, characteristic polynomial, diagonalization, spectral theorem.
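The power iteration mentioned among the numerical methods can be sketched in a few lines of plain Python (the function name power_iteration and the matrix-as-list-of-rows representation are assumptions for illustration):

```python
import math

def power_iteration(A, x, steps=100):
    """Estimate the dominant (largest-magnitude) eigenvalue of A by
    repeatedly applying A to x and normalizing; the Rayleigh quotient
    of the resulting unit vector converges to that eigenvalue."""
    for _ in range(steps):
        # y = A x
        y = [sum(a * xj for a, xj in zip(row, x)) for row in A]
        norm = math.sqrt(sum(v * v for v in y))
        x = [v / norm for v in y]  # renormalize to avoid overflow
    # Rayleigh quotient x . (A x) for the unit vector x
    Ax = [sum(a * xj for a, xj in zip(row, x)) for row in A]
    return sum(xi * yi for xi, yi in zip(x, Ax))

# The dominant eigenvalue of the example matrix [[3, 1], [0, 2]] is 3.
print(round(power_iteration([[3, 1], [0, 2]], [1.0, 1.0]), 6))  # 3.0
```

Convergence is geometric with rate |λ₂/λ₁|, so the method is fast when the dominant eigenvalue is well separated from the rest; inverse iteration applies the same idea to (A − μI)⁻¹ to target an eigenvalue near a chosen shift μ.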