eigenvector
An eigenvector of a square matrix A is a nonzero vector v such that Av = λv for some scalar λ, called an eigenvalue. Under the corresponding linear transformation, an eigenvector points along a direction that the transformation maps to itself: the vector is stretched or compressed by the factor λ without changing its line of direction.
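The definition can be checked directly: multiply a matrix by a candidate vector and compare with the scaled vector. A minimal sketch, using a hypothetical 2×2 example whose eigenvector happens to be (1, 1) with eigenvalue 3:

```python
# Check the definition Av = λv for a hypothetical 2x2 example.
A = [[2.0, 1.0],
     [1.0, 2.0]]
v = [1.0, 1.0]          # candidate eigenvector
lam = 3.0               # candidate eigenvalue

def matvec(M, x):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

Av = matvec(A, v)
lam_v = [lam * xi for xi in v]
print(Av == lam_v)      # True: v is an eigenvector of A with eigenvalue 3
```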
To compute eigenvectors, one first finds the eigenvalues by solving the characteristic equation det(A − λI) = 0. For each eigenvalue λ, the corresponding eigenvectors are the nonzero solutions v of (A − λI)v = 0, that is, the nonzero vectors in the null space of A − λI.
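For a 2×2 matrix this procedure can be carried out by hand, since the characteristic equation is a quadratic. A sketch under the assumption that the eigenvalues are real, again with a hypothetical example matrix:

```python
import math

# For a 2x2 matrix [[a, b], [c, d]],
# det(A - λI) = λ² - (a + d)λ + (ad - bc),
# so the eigenvalues come from the quadratic formula.
a, b, c, d = 2.0, 1.0, 1.0, 2.0          # hypothetical example
tr, det = a + d, a * d - b * c
disc = math.sqrt(tr * tr - 4 * det)      # assumes real eigenvalues
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2

# For each λ, solve (A - λI)v = 0; when b ≠ 0, one solution is v = (b, λ - a).
v1 = (b, lam1 - a)
v2 = (b, lam2 - a)
print(lam1, lam2)   # 3.0 1.0
print(v1, v2)       # (1.0, 1.0) (1.0, -1.0)
```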
Key properties include that eigenvectors associated with distinct eigenvalues are linearly independent. If an n × n matrix has n distinct eigenvalues, its eigenvectors form a basis of the space, and the matrix is diagonalizable.
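The independence property can be verified for a concrete pair of eigenvectors. Assuming the hypothetical matrix [[2, 1], [1, 2]], whose eigenvalues 3 and 1 have eigenvectors (1, 1) and (1, −1):

```python
# Eigenvectors of the hypothetical matrix [[2, 1], [1, 2]]:
# λ = 3 gives v1 = (1, 1) and λ = 1 gives v2 = (1, -1).
v1, v2 = (1.0, 1.0), (1.0, -1.0)

# Two vectors in the plane are linearly independent iff the determinant
# of the matrix with v1 and v2 as columns is nonzero.
det = v1[0] * v2[1] - v1[1] * v2[0]
print(det != 0)   # True: the eigenvectors are linearly independent
```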
Special cases include the identity matrix, where every nonzero vector is an eigenvector with eigenvalue 1, and triangular matrices, whose eigenvalues are exactly their diagonal entries. By contrast, a rotation of the plane through an angle that is not a multiple of π has no real eigenvectors, since no direction is preserved.
Applications and methods: eigenvectors reveal invariant directions under a transformation and underpin diagonalization, principal component analysis, spectral graph theory, and the stability analysis of dynamical systems. Numerically, a dominant eigenvector is often computed by power iteration, which repeatedly applies the matrix to a vector and normalizes the result.
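Power iteration can be sketched in a few lines. This is a minimal version, assuming one eigenvalue strictly dominates in magnitude (the function name and example matrix are illustrative):

```python
import math

def power_iteration(A, steps=50):
    """Repeatedly apply A and normalize; the iterates converge to the
    dominant eigenvector when one eigenvalue strictly dominates in magnitude."""
    v = [1.0] * len(A)
    for _ in range(steps):
        w = [sum(m * x for m, x in zip(row, v)) for row in A]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    # The Rayleigh quotient vᵀAv (with v normalized) estimates the eigenvalue.
    Av = [sum(m * x for m, x in zip(row, v)) for row in A]
    lam = sum(a * x for a, x in zip(Av, v))
    return lam, v

lam, v = power_iteration([[2.0, 1.0], [1.0, 2.0]])
print(round(lam, 6))   # 3.0, the dominant eigenvalue
```

In practice the loop would stop once successive iterates agree to a tolerance rather than after a fixed number of steps.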