Eigenvalues
An eigenvalue of a square matrix A is a scalar λ for which there exists a nonzero vector v satisfying Av = λv, where v is called an eigenvector associated with λ. Equivalently, λ is an eigenvalue if the linear transformation A has a nontrivial invariant direction, i.e., a nonzero vector that is mapped to a scalar multiple of itself.
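For concreteness, here is a minimal NumPy sketch checking Av = λv on a small example (the matrix and vector are illustrative choices, not taken from the text):

```python
# Minimal check of the definition Av = lambda*v on a concrete 2x2 matrix.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam = 3.0                  # an eigenvalue of A
v = np.array([1.0, 1.0])   # an eigenvector associated with lam

print(A @ v)                        # [3. 3.]
print(lam * v)                      # [3. 3.]
print(np.allclose(A @ v, lam * v))  # True: v is mapped to 3v
```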
To find eigenvalues, solve det(A − λI) = 0; the polynomial p_A(λ) = det(A − λI) is the characteristic polynomial of A. For an n × n matrix it has degree n, so A has exactly n eigenvalues over the complex numbers, counted with multiplicity.
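A short NumPy sketch of this computation (np.poly returns the characteristic polynomial's coefficients for a square matrix; the example matrix is illustrative):

```python
# Eigenvalues as roots of the characteristic polynomial det(A - lambda*I) = 0.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

coeffs = np.poly(A)          # coefficients of p_A: lambda^2 - 4*lambda + 3
print(coeffs)                # [ 1. -4.  3.]
print(np.roots(coeffs))      # [3. 1.], the eigenvalues
print(np.linalg.eigvals(A))  # same values, computed directly
```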
Eigenvectors are obtained by solving (A − λI)v = 0 for each eigenvalue λ. The algebraic multiplicity of an eigenvalue is its multiplicity as a root of the characteristic polynomial; its geometric multiplicity is the dimension of the associated eigenspace, the null space of A − λI. The geometric multiplicity never exceeds the algebraic multiplicity, and a matrix where the inequality is strict for some eigenvalue is called defective.
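The distinction is easiest to see on a defective matrix. A sketch assuming SciPy (scipy.linalg.null_space returns an orthonormal basis of the null space):

```python
# Eigenspace of A - lambda*I for a defective matrix: algebraic multiplicity 2,
# geometric multiplicity 1.
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])  # characteristic polynomial (lambda - 1)^2

lam = 1.0
V = null_space(A - lam * np.eye(2))  # basis of the eigenspace for lambda = 1
print(V)                             # one column, proportional to [1, 0]
print(V.shape[1])                    # geometric multiplicity: 1 (< 2)
```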
Two basic identities relate eigenvalues to matrix structure: trace(A) equals the sum of the eigenvalues (counting algebraic multiplicity), and det(A) equals their product.
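Both identities are easy to check numerically (a NumPy sketch with a random matrix; for a real matrix the eigenvalues come in conjugate pairs, so their sum and product are real up to rounding):

```python
# Check trace(A) = sum of eigenvalues and det(A) = product of eigenvalues.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
eigs = np.linalg.eigvals(A)  # complex in general for a real matrix

print(np.isclose(np.trace(A), eigs.sum().real))        # True
print(np.isclose(np.linalg.det(A), eigs.prod().real))  # True
```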
Special cases: real symmetric matrices have real eigenvalues, and eigenvectors belonging to distinct eigenvalues are orthogonal; more generally, Hermitian matrices have real eigenvalues and admit an orthonormal basis of eigenvectors (the spectral theorem).
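In NumPy this structure is exposed by np.linalg.eigh, which is specialized to symmetric/Hermitian input and returns real eigenvalues with orthonormal eigenvectors (a sketch; the matrix is a random symmetrized example):

```python
# Real symmetric case: real eigenvalues, orthonormal eigenvectors.
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
S = (B + B.T) / 2  # symmetrize

w, Q = np.linalg.eigh(S)                     # w ascending, Q orthonormal columns
print(w.dtype)                               # float64: eigenvalues are real
print(np.allclose(Q.T @ Q, np.eye(4)))       # True: Q^T Q = I
print(np.allclose(Q @ np.diag(w) @ Q.T, S))  # True: spectral decomposition
```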
Computation: exact solutions via the characteristic polynomial are feasible for small matrices; for larger or ill-conditioned problems, iterative numerical methods such as power iteration and the QR algorithm are used instead.
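As a sketch of the iterative idea (not of any particular library's internals), power iteration approximates the dominant eigenvalue by repeated multiplication and renormalization:

```python
# Power iteration: approximates the largest-magnitude eigenvalue and its
# eigenvector, assuming that eigenvalue is strictly dominant.
import numpy as np

def power_iteration(A, num_iters=1000, seed=2):
    v = np.random.default_rng(seed).standard_normal(A.shape[0])
    for _ in range(num_iters):
        v = A @ v
        v /= np.linalg.norm(v)  # renormalize to avoid overflow/underflow
    lam = v @ A @ v             # Rayleigh quotient estimate of the eigenvalue
    return lam, v

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, v = power_iteration(A)
print(lam)  # ~3.0, the dominant eigenvalue of A
```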
Applications span solving systems of differential equations and stability analysis, vibrations, quantum mechanics, PageRank, and dimensionality reduction (e.g., principal component analysis).
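To illustrate the last of these, a minimal PCA sketch on synthetic data (illustrative only): the leading eigenvectors of the sample covariance matrix give the directions of greatest variance.

```python
# Dimensionality reduction via PCA: project data onto the top eigenvectors
# of its covariance matrix.
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((200, 5))  # 200 samples, 5 features (synthetic)
Xc = X - X.mean(axis=0)            # center the data
C = Xc.T @ Xc / (len(X) - 1)       # sample covariance, symmetric 5x5

w, Q = np.linalg.eigh(C)           # eigenvalues in ascending order
top2 = Q[:, ::-1][:, :2]           # two leading principal directions
X_reduced = Xc @ top2              # project onto a 2-D subspace
print(X_reduced.shape)             # (200, 2)
```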