Eigenvalues and Eigenvectors
Introduction
When we apply a linear transformation to a vector, the result is typically a new vector pointing in a different direction and with a different magnitude. However, certain special vectors behave remarkably simply: they emerge from the transformation pointing in exactly the same direction (or the exact opposite), merely stretched or compressed by some factor. These special vectors are called eigenvectors, and the scaling factors are called eigenvalues. Together, they reveal the intrinsic geometry of a linear transformation.
Consider spinning a globe on its axis. Every point on the globe moves in a circle except for points on the axis itself, which stay fixed. The axis represents an eigenvector of the rotation (with eigenvalue 1): a direction that the transformation preserves. This example illustrates a profound principle: even complex transformations often have simple underlying structure, and eigenvectors expose that structure.
Eigenvalues and eigenvectors are not merely computational tools. They are the key to understanding the behavior of dynamical systems, the stability of equilibria, the principal axes of stress in materials, the modes of vibration in mechanical systems, and the fundamental frequencies of quantum states. In data science, they underlie principal component analysis, Google's PageRank algorithm, and spectral clustering. Mastering eigenvalues means gaining insight into the deepest structure of linear maps.
Definition
Let A be an n × n matrix. A nonzero vector v is called an eigenvector of A if Av = λv for some scalar λ.
The scalar λ is called the eigenvalue corresponding to v. Note that v must be nonzero: the zero vector satisfies Av = λv for every λ, so it is excluded by definition.
The equation Av = λv says that applying A to v produces a scalar multiple of v: the transformation does not change the direction of v (except possibly reversing it), only its length.
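As a quick numerical check of the definition, the following sketch uses numpy and an illustrative matrix (chosen here for the example, not taken from the text) to confirm Av = λv for each computed eigenpair:

```python
import numpy as np

# Illustrative matrix (an assumption for this example); its eigenvalues are 3 and 1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and unit-norm eigenvectors
# (eigenvectors are the COLUMNS of V).
eigvals, V = np.linalg.eig(A)

# Check the defining equation A v = lambda v for each eigenpair.
for lam, v in zip(eigvals, V.T):
    assert np.allclose(A @ v, lam * v)
```

Note that numpy returns eigenvectors normalized to unit length; any nonzero scalar multiple of an eigenvector is also an eigenvector.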
Geometric Interpretation
Geometrically, eigenvectors are the directions along which the linear transformation acts by pure scaling. Think of a matrix A as a machine that transforms every vector in the plane: most directions come out rotated as well as rescaled, but along an eigenvector the output is simply λ times the input.
For a 2 × 2 matrix with two distinct real eigenvalues, there are two independent eigenvector directions. Vectors along the first are scaled by λ₁, vectors along the second by λ₂, and every other vector is transformed as a combination of these two scalings.
When all eigenvalues are positive, the transformation preserves orientation — it stretches or compresses without flipping. When an eigenvalue is negative, vectors along that eigenvector are reflected. When eigenvalues are complex, the transformation involves rotation, and there are no real directions that remain fixed (except in trivial cases).
The eigenvalues encode the transformation’s behavior along its principal axes, while the eigenvectors identify those axes. Eigenanalysis decomposes a potentially complicated transformation into simple scalings along well-chosen directions.
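This decomposition can be made concrete. The sketch below (with a hypothetical matrix and eigenpairs chosen for illustration) writes a vector as a combination of eigenvectors and checks that the matrix acts by scaling each component independently:

```python
import numpy as np

# Hypothetical matrix with eigenvalue 5 (eigenvector v1) and eigenvalue 2
# (eigenvector v2); the pairs can be checked directly.
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
v1 = np.array([2.0, 1.0])   # A v1 = 5 v1
v2 = np.array([1.0, -1.0])  # A v2 = 2 v2

# Any vector x = a*v1 + b*v2 is transformed component by component:
# A x = 5*a*v1 + 2*b*v2, a pure scaling along each eigen-direction.
a, b = 3.0, -2.0
x = a * v1 + b * v2
assert np.allclose(A @ x, 5 * a * v1 + 2 * b * v2)
```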
Finding Eigenvalues
To find the eigenvalues of a matrix A, rewrite the equation Av = λv as Av − λv = 0.
Factor out v: (A − λI)v = 0, where I is the identity matrix. (The identity is needed so that the subtraction A − λI is between two matrices.)
This is a homogeneous linear system. For a nonzero solution v to exist, the matrix A − λI must be singular, which happens exactly when det(A − λI) = 0.
The left-hand side expands to a polynomial in λ of degree n, called the characteristic polynomial. Its roots are precisely the eigenvalues of A.
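Numerically, the characteristic polynomial and its roots can be obtained with numpy (using an illustrative matrix chosen here, with np.poly computing the coefficients and np.roots recovering the eigenvalues):

```python
import numpy as np

# Illustrative matrix (an assumption for this example).
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

# np.poly returns the characteristic polynomial's coefficients,
# highest degree first: here lambda^2 - 7*lambda + 10 -> [1, -7, 10].
coeffs = np.poly(A)

# The roots of the characteristic polynomial are the eigenvalues.
eigvals = np.roots(coeffs)

assert np.allclose(coeffs, [1.0, -7.0, 10.0])
assert np.allclose(sorted(eigvals), [2.0, 5.0])
```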
The Characteristic Polynomial
For an n × n matrix A, the characteristic polynomial det(A − λI) has degree n, so A has at most n distinct eigenvalues; counted with multiplicity over the complex numbers, it has exactly n.
The coefficient of λⁿ⁻¹ is, up to sign, the trace of A, and the constant term is det(A). Consequently, the sum of the eigenvalues (with multiplicity) equals the trace of A, and their product equals the determinant of A.
These relationships provide useful checks on eigenvalue computations and connect eigenvalues to other fundamental matrix properties.
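These checks are easy to carry out numerically. A minimal sketch, using an illustrative matrix assumed for the example:

```python
import numpy as np

# Illustrative matrix (an assumption); eigenvalues are 5 and 2.
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
eigvals = np.linalg.eigvals(A)

# Sum of eigenvalues equals the trace; product equals the determinant.
assert np.isclose(eigvals.sum(), np.trace(A))        # 5 + 2 = 7
assert np.isclose(eigvals.prod(), np.linalg.det(A))  # 5 * 2 = 10
```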
Finding Eigenvectors
Once an eigenvalue λ is known, the corresponding eigenvectors are found by solving (A − λI)v = 0 for nonzero v.
This amounts to finding the null space of the matrix A − λI, typically by row reduction.
The set of all eigenvectors corresponding to a single eigenvalue λ, together with the zero vector, forms a subspace called the eigenspace of λ.
The dimension of this eigenspace is called the geometric multiplicity of λ. It is always at least 1 and never exceeds the algebraic multiplicity of λ as a root of the characteristic polynomial.
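One practical way to compute a null space numerically is via the singular value decomposition: the right-singular vectors whose singular values are (numerically) zero span the null space. A sketch, with a hypothetical helper `null_space` and an illustrative matrix and eigenvalue assumed for the example:

```python
import numpy as np

def null_space(M, tol=1e-10):
    """Return columns spanning the null space of M, via the SVD."""
    _, s, Vt = np.linalg.svd(M)
    # Singular values come sorted in descending order; rows of Vt whose
    # singular value is below tol correspond to null-space directions.
    rank = int((s > tol).sum())
    return Vt[rank:].T

# Illustrative matrix with known eigenvalue 5 (an assumption).
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
lam = 5.0
vecs = null_space(A - lam * np.eye(2))

# Every column returned is an eigenvector for lam.
for v in vecs.T:
    assert np.allclose(A @ v, lam * v)
```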
Worked Example
Consider the matrix:

A = | 4  2 |
    | 1  3 |

We seek all eigenvalues and eigenvectors of A.
Step 1: Find the Characteristic Polynomial
Compute A − λI:

A − λI = | 4 − λ    2   |
         |   1    3 − λ |

The determinant is:

det(A − λI) = (4 − λ)(3 − λ) − (2)(1) = λ² − 7λ + 12 − 2 = λ² − 7λ + 10

Step 2: Find the Eigenvalues
Set the characteristic polynomial equal to zero and solve:

λ² − 7λ + 10 = 0

Factoring:

(λ − 5)(λ − 2) = 0

The eigenvalues are λ₁ = 5 and λ₂ = 2.
We can verify: the trace of A is 4 + 3 = 7, which equals 5 + 2, and the determinant is (4)(3) − (2)(1) = 10, which equals (5)(2).
Step 3: Find the Eigenvectors
For λ₁ = 5, solve (A − 5I)v = 0:

| −1   2 | |v₁|   |0|
|  1  −2 | |v₂| = |0|

Both equations reduce to v₁ = 2v₂, so we may take the eigenvector v = (2, 1).
For λ₂ = 2, solve (A − 2I)v = 0:

| 2  2 | |v₁|   |0|
| 1  1 | |v₂| = |0|

Both equations give v₁ + v₂ = 0, so we may take the eigenvector v = (1, −1).
Verification: Av₁ = (10, 5) = 5(2, 1) = 5v₁ and Av₂ = (2, −2) = 2(1, −1) = 2v₂, as required.
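A worked example of this shape can also be confirmed numerically. The sketch below assumes the illustrative matrix A = [[4, 2], [1, 3]], which has eigenvalues 5 and 2 with eigenvectors (2, 1) and (1, −1):

```python
import numpy as np

# Assumed example matrix: eigenvalues 5 and 2.
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
eigvals, V = np.linalg.eig(A)

# numpy agrees with the hand computation (up to ordering and scaling).
assert np.allclose(sorted(eigvals), [2.0, 5.0])
assert np.allclose(A @ np.array([2.0, 1.0]), 5 * np.array([2.0, 1.0]))
assert np.allclose(A @ np.array([1.0, -1.0]), 2 * np.array([1.0, -1.0]))
```

numpy's eigenvectors are scaled to unit length, so they will be nonzero multiples of (2, 1) and (1, −1) rather than those exact vectors.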
Fundamental Properties
Eigenvectors of Distinct Eigenvalues Are Independent
If v₁, v₂, …, vₖ are eigenvectors of A corresponding to distinct eigenvalues λ₁, λ₂, …, λₖ, then the set {v₁, …, vₖ} is linearly independent.
The proof proceeds by contradiction. Suppose the eigenvectors were dependent, and take a shortest dependence relation expressing one eigenvector as a linear combination of the others. Applying A to this relation, and separately multiplying the relation by that eigenvector's eigenvalue, then subtracting, yields a shorter dependence relation whose coefficients are scaled by differences of distinct eigenvalues, all nonzero. This contradicts the minimality of the original relation, so the eigenvectors must be independent.
Eigenvalues of Special Matrices
For a triangular matrix (upper or lower), the eigenvalues are exactly the diagonal entries. This follows immediately from the fact that A − λI is also triangular, so det(A − λI) is the product of the diagonal entries (a₁₁ − λ)(a₂₂ − λ)⋯(aₙₙ − λ), whose roots are the diagonal entries of A.
For a symmetric matrix (Aᵀ = A), all eigenvalues are real, and eigenvectors corresponding to distinct eigenvalues are orthogonal. In fact, by the spectral theorem, every symmetric matrix has an orthonormal basis of eigenvectors.
For an orthogonal matrix (AᵀA = I), every eigenvalue has absolute value 1, because orthogonal matrices preserve lengths: ‖Av‖ = ‖v‖ forces |λ| = 1.
Eigenvalues Under Matrix Operations
If λ is an eigenvalue of A with eigenvector v, then:
λᵏ is an eigenvalue of Aᵏ (for any positive integer k), with the same eigenvector v.
If A is invertible, then 1/λ is an eigenvalue of A⁻¹, with eigenvector v.
λ + c is an eigenvalue of A + cI, with eigenvector v.
These properties follow directly from the eigenvalue equation. For example, if Av = λv, then A²v = A(Av) = A(λv) = λ(Av) = λ²v, and induction gives Aᵏv = λᵏv.
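All three properties can be verified numerically. The sketch below assumes, for illustration, a matrix with the known eigenpair λ = 5, v = (2, 1):

```python
import numpy as np

# Assumed illustrative matrix with eigenpair lam = 5, v = (2, 1).
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
lam = 5.0
v = np.array([2.0, 1.0])

# lambda^k is an eigenvalue of A^k, with the same eigenvector.
assert np.allclose(np.linalg.matrix_power(A, 3) @ v, lam**3 * v)

# 1/lambda is an eigenvalue of the inverse, with the same eigenvector.
assert np.allclose(np.linalg.inv(A) @ v, (1 / lam) * v)

# lambda + c is an eigenvalue of A + c*I, with the same eigenvector.
c = 7.0
assert np.allclose((A + c * np.eye(2)) @ v, (lam + c) * v)
```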
Complex Eigenvalues
Even for a matrix with all real entries, the characteristic polynomial may have complex roots. When this happens, the complex eigenvalues occur in conjugate pairs: if λ = a + bi is an eigenvalue, then so is its conjugate λ̄ = a − bi, and the corresponding eigenvectors are complex conjugates of each other.
Complex eigenvalues indicate that the transformation involves rotation. There is no real direction that the transformation preserves (other than by accident). The real and imaginary parts of the complex eigenvector can be combined to describe the rotational behavior on a real invariant plane.
For example, the rotation matrix:

R(θ) = | cos θ  −sin θ |
       | sin θ   cos θ |

has eigenvalues λ = cos θ ± i sin θ = e^(±iθ). Unless θ is a multiple of π, these are genuinely complex, reflecting the fact that a rotation of the plane leaves no real direction fixed.
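This is easy to confirm numerically for a sample angle (θ = 0.3 here is an arbitrary choice):

```python
import numpy as np

theta = 0.3  # any angle that is not a multiple of pi
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

eigvals = np.linalg.eigvals(R)

# The eigenvalues are the conjugate pair cos(theta) +/- i*sin(theta),
# i.e. e^{+i theta} and e^{-i theta}, both on the unit circle.
assert np.allclose(eigvals.real, np.cos(theta))
assert np.allclose(np.sort(eigvals.imag), [-np.sin(theta), np.sin(theta)])
assert np.allclose(np.abs(eigvals), 1.0)  # rotations preserve length
```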
Applications and Connections
Eigenvalues and eigenvectors appear throughout mathematics and its applications. In differential equations, solutions to the linear system x′ = Ax are built from terms of the form e^(λt) v,
where λ is an eigenvalue of A and v is a corresponding eigenvector. The signs of the real parts of the eigenvalues determine whether solutions grow or decay, and hence whether equilibria are stable.
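To see why e^(λt) v solves x′ = Ax, note x′(t) = λ e^(λt) v = e^(λt) (Av) = A x(t). The sketch below checks this with a finite-difference derivative, assuming an illustrative matrix with the eigenpair λ = 2, v = (1, −1):

```python
import numpy as np

# Assumed illustrative matrix with eigenpair lam = 2, v = (1, -1).
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
lam = 2.0
v = np.array([1.0, -1.0])

def x(t):
    """Candidate solution x(t) = e^{lam t} v of the system x' = A x."""
    return np.exp(lam * t) * v

# Centered finite difference approximates x'(t); it should match A x(t).
t, h = 1.0, 1e-6
derivative = (x(t + h) - x(t - h)) / (2 * h)
assert np.allclose(derivative, A @ x(t), rtol=1e-5)
```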
In principal component analysis, eigenvectors of the covariance matrix give the principal directions of variation in the data, and eigenvalues measure the variance along each direction. This enables dimensionality reduction while preserving the dominant structure.
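A minimal PCA sketch on synthetic correlated data (generated here for illustration): the top eigenvector of the covariance matrix should recover the direction along which the data was constructed to vary most.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 2-D data: variation mostly along the direction (1, 0.5).
n = 500
t = rng.standard_normal(n)
data = np.column_stack([t, 0.5 * t + 0.1 * rng.standard_normal(n)])
data -= data.mean(axis=0)  # center the data before PCA

# Principal directions are the eigenvectors of the covariance matrix;
# the eigenvalues give the variance captured along each direction.
cov = np.cov(data, rowvar=False)
variances, directions = np.linalg.eigh(cov)  # eigenvalues in ascending order

top = directions[:, -1]  # direction of the largest variance
expected = np.array([1.0, 0.5]) / np.linalg.norm([1.0, 0.5])
# The dominant direction matches (1, 0.5), up to sign.
assert abs(abs(top @ expected) - 1.0) < 0.01
```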
In quantum mechanics, physical observables are represented by operators, and measured values are eigenvalues. Eigenvectors represent states with definite values of the observable.
In Google’s PageRank algorithm, web pages are ranked according to the dominant eigenvector of a matrix representing link structure. The eigenvalue equation Av = v (an eigenvalue of 1) captures a self-consistent ranking: a page is important precisely when important pages link to it.
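The dominant eigenvector can be found by power iteration, as sketched below on a toy three-page link structure (the link matrix is an assumption for illustration; real PageRank also adds a damping factor, omitted here):

```python
import numpy as np

# Toy column-stochastic link matrix: column j holds the probability of
# following a link from page j to each page.
# Page 0 links to 1 and 2; page 1 links to 2; page 2 links to 0.
M = np.array([[0.0, 0.0, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 1.0, 0.0]])

# Power iteration: repeated multiplication converges to the dominant
# eigenvector, which has eigenvalue 1 for a column-stochastic matrix.
rank = np.full(3, 1.0 / 3.0)
for _ in range(200):
    rank = M @ rank
rank /= rank.sum()

assert np.allclose(M @ rank, rank, atol=1e-8)  # the eigenvalue-1 equation
```

Here pages 0 and 2 end up tied for the top rank, with page 1 below them, matching the link structure.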
Summary
An eigenvector of a matrix A is a nonzero vector v satisfying Av = λv for some scalar λ, the corresponding eigenvalue. Such vectors are only rescaled by the transformation, never redirected.
Eigenvalues are found as roots of the characteristic polynomial det(A − λI) = 0; the eigenvectors for each λ are the nonzero vectors in the null space of A − λI.
Key properties include: eigenvectors corresponding to distinct eigenvalues are linearly independent; the sum of eigenvalues equals the trace; the product equals the determinant; symmetric matrices have real eigenvalues and orthogonal eigenvectors. Complex eigenvalues indicate rotational behavior.
Eigenanalysis is one of the most powerful tools in applied mathematics, revealing the intrinsic structure of linear transformations and enabling applications from stability analysis to data science to quantum physics.