eigenvector

An eigenvector of a square matrix A is a nonzero vector v such that Av = λv for some scalar λ, called an eigenvalue. Under the corresponding linear transformation, an eigenvector points along a direction that the transformation merely stretches or compresses by the factor λ; its direction is unchanged (or exactly reversed, when λ is negative).
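
The defining relation is easy to check concretely. A minimal NumPy sketch (the matrix, vector, and eigenvalue below are made-up illustrations, not taken from the text):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [0.0, 3.0]])
    v = np.array([1.0, 1.0])   # an eigenvector of A, found by inspection
    lam = 3.0                  # the corresponding eigenvalue

    print(A @ v)    # [3. 3.]
    print(lam * v)  # [3. 3.]  -- both sides of Av = λv agree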

To compute eigenvectors, one first finds the eigenvalues by solving det(A − λI) = 0. For each eigenvalue λ, solve (A − λI)v = 0 to obtain the corresponding eigenvectors. The set of all eigenvectors associated with a given λ, together with the zero vector, forms the eigenspace Eλ, a subspace whose dimension is the geometric multiplicity of λ.
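
The same two steps can be carried out symbolically. A short SymPy sketch (the 2 × 2 matrix is an arbitrary example chosen for illustration):

    import sympy as sp

    A = sp.Matrix([[4, 1],
                   [2, 3]])
    lam = sp.symbols('lambda')

    char_poly = (A - lam * sp.eye(2)).det()   # det(A − λI)
    eigenvalues = sp.solve(char_poly, lam)    # roots: 2 and 5

    for ev in eigenvalues:
        # Solve (A − λI)v = 0; the nullspace is a basis of the eigenspace Eλ.
        basis = (A - ev * sp.eye(2)).nullspace()
        print(ev, basis)

Each nullspace basis spans the eigenspace of its eigenvalue, and its length is that eigenvalue's geometric multiplicity.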

Key properties include that eigenvectors associated with distinct eigenvalues are linearly independent. If an n × n matrix has a full set of n linearly independent eigenvectors, it is diagonalizable, meaning A can be written as A = PDP⁻¹, where D is diagonal with the eigenvalues on its diagonal and P has the corresponding eigenvectors as its columns. In this case, the action of A is simply to scale each eigenvector by its eigenvalue.
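
Diagonalization can be verified numerically as well. A NumPy sketch (again with an arbitrary example matrix, which has distinct eigenvalues and is therefore diagonalizable):

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])
    eigvals, P = np.linalg.eig(A)   # columns of P are eigenvectors
    D = np.diag(eigvals)

    # Reconstruct A from A = PDP⁻¹ and compare.
    print(np.allclose(P @ D @ np.linalg.inv(P), A))   # True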

Special cases include the identity matrix, where every nonzero vector is an eigenvector with eigenvalue 1, and the zero matrix, where every nonzero vector is an eigenvector with eigenvalue 0.

Applications and methods: eigenvectors reveal invariant directions under a transformation and underpin diagonalization, principal component analysis, vibration analysis, and stability studies. Numerically, power iteration approximates a dominant eigenvector, while the QR algorithm computes a full set of eigenvalues and eigenvectors.

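Power iteration is simple enough to sketch directly. A minimal NumPy implementation (the function name and parameters are illustrative; it assumes A has a unique dominant eigenvalue, positive here so the iterates settle rather than alternate in sign):

    import numpy as np

    def power_iteration(A, num_iters=1000, tol=1e-12):
        # Start from a random unit vector.
        rng = np.random.default_rng(0)
        v = rng.standard_normal(A.shape[0])
        v /= np.linalg.norm(v)
        for _ in range(num_iters):
            w = A @ v                  # apply the matrix
            w /= np.linalg.norm(w)     # renormalize to unit length
            if np.linalg.norm(w - v) < tol:
                v = w
                break
            v = w
        lam = v @ A @ v                # Rayleigh quotient: eigenvalue estimate
        return lam, v

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    lam, v = power_iteration(A)
    print(lam, v)   # ≈ 3.618 and the corresponding unit eigenvector

In practice one usually calls a library routine such as np.linalg.eig, which wraps LAPACK solvers built on the QR algorithm mentioned above.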