Diagonalization

Diagonalization is the process of finding a diagonal matrix that is similar to a given square matrix A. If there exists an invertible matrix P such that P^{-1}AP = D, where D is diagonal, then A is diagonalizable, and D contains the eigenvalues of A on its diagonal. The columns of P are the corresponding eigenvectors of A.

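As a concrete illustration of the definition, the short NumPy sketch below (the matrix and the choice of library are illustrative assumptions, not part of the original text) takes the 2×2 matrix A = [[4, 1], [2, 3]], whose eigenvalues are 5 and 2 with eigenvectors (1, 1) and (1, −2), builds P from those eigenvectors, and checks that P^{-1}AP comes out diagonal and that AP = PD.

    import numpy as np

    # Hand-picked 2x2 example: eigenvalues 5 and 2 with
    # eigenvectors (1, 1) and (1, -2) (illustrative choice).
    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])
    P = np.array([[1.0,  1.0],
                  [1.0, -2.0]])      # eigenvectors as columns
    D = np.diag([5.0, 2.0])          # eigenvalues on the diagonal

    # The similarity transform recovers D, and the columns of P are eigenvectors.
    print(np.allclose(np.linalg.inv(P) @ A @ P, D))  # True
    print(np.allclose(A @ P, P @ D))                 # True
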
A matrix is diagonalizable over a field F if and only if F^n has a basis consisting of eigenvectors of A, i.e., the geometric multiplicities of the eigenvalues sum to n. Equivalently, the minimal polynomial of A splits into distinct linear factors over F. In particular, a matrix with n distinct eigenvalues is diagonalizable. Over the real numbers, a real symmetric matrix is always diagonalizable by an orthogonal matrix: there exists an orthogonal Q with Q^T A Q = D, where D is real diagonal (the spectral theorem).

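For the real symmetric case, the spectral theorem can be checked numerically. The sketch below is illustrative only: it uses NumPy's np.linalg.eigh (which is designed for symmetric matrices and returns orthonormal eigenvectors) on an arbitrarily chosen symmetric matrix to confirm that Q is orthogonal and that Q^T A Q is real diagonal.

    import numpy as np

    # An arbitrary real symmetric matrix (illustrative choice).
    A = np.array([[2.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])

    # eigh handles symmetric/Hermitian input: real eigenvalues,
    # eigenvectors returned as the columns of an orthogonal Q.
    eigvals, Q = np.linalg.eigh(A)

    print(np.allclose(Q.T @ Q, np.eye(3)))             # Q is orthogonal
    print(np.allclose(Q.T @ A @ Q, np.diag(eigvals)))  # Q^T A Q = D
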
Computation typically follows these steps: find the eigenvalues by solving det(A − λI) = 0; for each eigenvalue λ, solve (A − λI)v = 0 to obtain eigenvectors; if n linearly independent eigenvectors can be found, form P with these eigenvectors as columns; then D = P^{-1}AP is diagonal and contains the eigenvalues on its diagonal. When a matrix is not diagonalizable, it is similar to a Jordan form rather than a diagonal matrix.

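The same steps can be carried out numerically. The sketch below is a minimal illustration (it reuses the 2×2 matrix from the earlier example and relies on NumPy's np.linalg.eig, which computes eigenvalues and eigenvectors in one call rather than by hand): it forms P from the eigenvector columns, checks that they are linearly independent, and confirms that P^{-1}AP is diagonal.

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])       # same illustrative matrix as above

    # Steps 1-2: eig returns the eigenvalues and, as columns of the second
    # output, the corresponding eigenvectors (solving det(A - lambda I) = 0
    # and (A - lambda I)v = 0 internally).
    eigvals, P = np.linalg.eig(A)

    # Step 3: the eigenvector columns must be linearly independent,
    # i.e. P must be invertible, for A to be diagonalizable.
    assert np.linalg.matrix_rank(P) == A.shape[0], "not diagonalizable"

    # Step 4: D = P^{-1} A P is diagonal with the eigenvalues on its diagonal.
    D = np.linalg.inv(P) @ A @ P
    print(np.allclose(D, np.diag(eigvals)))  # True
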
Diagonalization simplifies many computations, notably powers and functions of A, since A^k = P D^k P^{-1} whenever A is diagonalizable as A = P D P^{-1}.

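The power formula holds because (P D P^{-1})^k telescopes: each interior P^{-1}P cancels, leaving P D^k P^{-1}, and D^k only requires raising the diagonal entries to the k-th power. The sketch below (illustrative, again using NumPy and the same example matrix) compares this route with direct repeated multiplication.

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])       # same illustrative matrix as above
    k = 10

    eigvals, P = np.linalg.eig(A)    # A = P D P^{-1}

    # D^k is just the k-th powers of the diagonal entries.
    A_k = P @ np.diag(eigvals ** k) @ np.linalg.inv(P)

    print(np.allclose(A_k, np.linalg.matrix_power(A, k)))  # True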