Eigenvalues / Eigenvectors
Characteristic Polynomial
For $A \in \mathbb{R}^{n \times n}$ and $\lambda \in \mathbb{R}$.
Remember that $\det(A - \lambda I)$ is a polynomial in $\lambda$ of degree $n$.
The characteristic polynomial of $A$ is defined as:

$$p_A(\lambda) := \det(A - \lambda I)$$

where $I$ is the $n \times n$ identity matrix.
Sometimes it is defined like this instead:

$$p_A(\lambda) := \det(\lambda I - A)$$

The only difference is that this version is guaranteed to be monic: the leading coefficient is always $1$.
Unlike the first definition, whose leading coefficient is $(-1)^n$.
Roots of the Characteristic Polynomial
These are literally the roots of the characteristic polynomial, i.e. the values $\lambda$ satisfying:

$$p_A(\lambda) = \det(A - \lambda I) = 0$$

Also called the eigenvalues of $A$.
Eigenvalues and Eigenvectors
Let $A \in \mathbb{R}^{n \times n}$ be a square matrix, $\lambda \in \mathbb{R}$, and $x \in \mathbb{R}^n$.
For the eigenvalue equation:

$$Ax = \lambda x$$

The satisfying non-zero vector $x$ is called an eigenvector of $A$, and $\lambda$ is its corresponding eigenvalue.
Intuition
Intuitively, it means that for some linear transformation represented by $A$, the eigenvector $x$ does not change direction; it is only scaled by the factor $\lambda$.
In other words, this transformation stretches (or flips, if $\lambda < 0$) $x$ by $\lambda$.
Some like to order the eigenvalues in descending order, and call them the first, second, etc. eigenvalues.
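The eigenvalue equation is easy to check numerically. A minimal sketch using numpy (the matrix values here are arbitrary, chosen only for illustration):

```python
import numpy as np

# An arbitrary symmetric example matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns eigenvalues and a matrix whose columns are eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# For each pair, A @ x equals lambda * x: the transformation only scales x.
for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ x, lam * x)
```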
Finding Eigenvalues via Characteristic Polynomial
Eigenvectors and the Kernel
The above equation can be rewritten as:

$$Ax - \lambda x = 0$$

or equivalently:

$$(A - \lambda I)x = 0$$

With this homogeneous system, we see that the non-trivial solution $x$ is an element of the kernel:

$$x \in \ker(A - \lambda I), \quad x \neq 0$$
Rank of $A - \lambda I$
We saw that the eigenvector is a non-trivial element of the kernel.
This means that the columns of $A - \lambda I$ are linearly dependent.
Therefore, $A - \lambda I$ is not full rank: $\operatorname{rank}(A - \lambda I) < n$.
Determinant of $A - \lambda I$
From here, we know that the determinant of a matrix is non-zero if and only if the matrix is full rank.
A matrix is invertible only when its columns are linearly independent, and for the inverse to be undefined, the determinant must be zero.
However, we just saw that $A - \lambda I$ is not full rank, so it is not invertible.
Hence:

$$\det(A - \lambda I) = 0$$
Solving the Characteristic Polynomial
Because we know that every eigenvalue $\lambda$ satisfies $\det(A - \lambda I) = 0$,
we can use the characteristic polynomial to find the eigenvalues of $A$: they are exactly the roots of $p_A(\lambda)$.
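A small numerical sketch of this route, using numpy's `np.poly` (which returns the coefficients of the characteristic polynomial) and `np.roots` on an arbitrary example matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.poly returns the coefficients of det(lambda*I - A), highest power first.
coeffs = np.poly(A)  # here: lambda^2 - 4*lambda + 3

# The roots of the characteristic polynomial are the eigenvalues of A.
eigenvalues = np.roots(coeffs)

# They agree with the eigenvalues computed directly.
assert np.allclose(sorted(eigenvalues.real), sorted(np.linalg.eigvals(A).real))
```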
Eigenvalues of a Transpose
A matrix and its transpose share the same eigenvalues, since $\det(A^\top - \lambda I) = \det((A - \lambda I)^\top) = \det(A - \lambda I)$. However, the eigenvectors are not necessarily the same.
Defective Matrix
A square matrix $A \in \mathbb{R}^{n \times n}$ is defective if it has fewer than $n$ linearly independent eigenvectors.
Linear Independence of Eigenvectors
Eigenvectors corresponding to distinct eigenvalues are linearly independent.
Therefore, a matrix with $n$ distinct eigenvalues has $n$ linearly independent eigenvectors, which form a basis of $\mathbb{R}^n$.
Diagonalizable Matrix (Non-Defective Matrix)
We say that a matrix $A \in \mathbb{R}^{n \times n}$ is diagonalizable if it is similar to a diagonal matrix $D$.
It is saying that we can make $A$ diagonal through a change of basis.
In other words, there exists an invertible matrix $P \in \mathbb{R}^{n \times n}$ such that:

$$D = P^{-1} A P$$

This can be rewritten as:

$$A = P D P^{-1}$$
Eigen-Decomposition
Any non-defective matrix $A \in \mathbb{R}^{n \times n}$ can be factored as:

$$A = P D P^{-1}$$

where the columns of $P$ are the eigenvectors of $A$, and $D$ is a diagonal matrix whose entries are the corresponding eigenvalues.
What?
If you rewrite the above equation as:

$$AP = PD$$

Let's say $P = [p_1 \mid \dots \mid p_n]$ and $D = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$.
Keeping in mind that multiplying $P$ by the diagonal matrix $D$ on the right scales each column $p_i$ by $\lambda_i$:

$$PD = [\lambda_1 p_1 \mid \dots \mid \lambda_n p_n]$$

Since $AP = [Ap_1 \mid \dots \mid Ap_n]$, matching the columns gives:

$$Ap_i = \lambda_i p_i$$

We see that each column $p_i$ of $P$ satisfies the eigenvalue equation with eigenvalue $\lambda_i$.
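The decomposition can be verified numerically; a minimal sketch with numpy, on an arbitrary non-defective matrix:

```python
import numpy as np

# An arbitrary matrix with distinct eigenvalues (hence non-defective).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of P are eigenvectors; D holds the eigenvalues on its diagonal.
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# A = P D P^{-1}
assert np.allclose(A, P @ D @ np.linalg.inv(P))

# Column by column: A p_i = lambda_i p_i, i.e. A P = P D.
assert np.allclose(A @ P, P @ D)
```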
Determinant of a Matrix Using Eigenvalues
What follows is a quick way to calculate the determinant of a matrix using its eigenvalues.
A property of the determinant is that it is invariant under change of basis.
So for similar matrices $A = P D P^{-1}$:

$$\det(A) = \det(P D P^{-1}) = \det(P) \det(D) \det(P)^{-1} = \det(D)$$

Therefore the determinant of a matrix is the product of its eigenvalues:

$$\det(A) = \prod_{i=1}^{n} \lambda_i$$
Trace of a Matrix Using Eigenvalues
Same logic applies to trace as it is also invariant under change of basis.
The trace of $A$ then equals the trace of the diagonal matrix $D$, whose entries are the eigenvalues.
Therefore the trace of a matrix is the sum of its eigenvalues:

$$\operatorname{tr}(A) = \sum_{i=1}^{n} \lambda_i$$

Again, repeated eigenvalues count with their multiplicity.
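Both identities (determinant as product, trace as sum) are easy to check numerically; a small sketch with numpy on an arbitrary example matrix:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues = np.linalg.eigvals(A)

# det(A) is the product of the eigenvalues; tr(A) is their sum.
assert np.isclose(np.linalg.det(A), np.prod(eigenvalues))
assert np.isclose(np.trace(A), np.sum(eigenvalues))
```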
Spectral Theorem
A symmetric matrix $A \in \mathbb{R}^{n \times n}$ ($A = A^\top$) always has real eigenvalues and is diagonalizable.
Further, the eigenvectors of $A$ can be chosen to form an orthonormal basis of $\mathbb{R}^n$.
Therefore the decomposition is:

$$A = Q \Lambda Q^\top$$

where $Q$ is an orthogonal matrix of eigenvectors ($Q^{-1} = Q^\top$) and $\Lambda$ is the diagonal matrix of eigenvalues.
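A minimal numerical sketch with numpy, using `np.linalg.eigh` (the routine specialized for symmetric matrices) on an arbitrary symmetric example:

```python
import numpy as np

# An arbitrary symmetric matrix: A == A.T
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh returns real eigenvalues and an orthonormal set of eigenvectors.
eigenvalues, Q = np.linalg.eigh(A)

# Q is orthogonal: Q^{-1} = Q^T
assert np.allclose(Q.T @ Q, np.eye(3))

# A = Q diag(lambda) Q^T
assert np.allclose(A, Q @ np.diag(eigenvalues) @ Q.T)
```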
Eigenspace
Eigenvectors are not unique; any non-zero scalar multiple of an eigenvector is also an eigenvector.
These eigenvectors that share the same eigenvalue $\lambda$ span a subspace of $\mathbb{R}^n$ called the eigenspace of $A$ associated with $\lambda$:

$$E_\lambda = \ker(A - \lambda I)$$
Geometric Multiplicity
Let $\lambda_i$ be an eigenvalue of a square matrix $A$.
Then the geometric multiplicity of $\lambda_i$ is the dimension of its eigenspace, i.e. the number of linearly independent eigenvectors associated with $\lambda_i$.
Algebraic multiplicity is the multiplicity of $\lambda_i$ as a root of the characteristic polynomial.
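A classic case where the two multiplicities differ is a shear matrix, which is defective. A small numpy sketch, using the rank-nullity relation $\dim \ker(A - \lambda I) = n - \operatorname{rank}(A - \lambda I)$ to compute the geometric multiplicity:

```python
import numpy as np

# A shear matrix: eigenvalue 1 with algebraic multiplicity 2,
# but only a one-dimensional eigenspace (geometric multiplicity 1).
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Algebraic multiplicity: 1 appears twice as a root of the char. polynomial.
eigenvalues = np.linalg.eigvals(A)
assert np.allclose(eigenvalues, [1.0, 1.0])

# Geometric multiplicity: dim ker(A - I) = n - rank(A - I) = 2 - 1 = 1.
geometric_multiplicity = 2 - np.linalg.matrix_rank(A - np.eye(2))
assert geometric_multiplicity == 1
```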