Eigenspace
Explore the concept of eigenspaces for square matrices, distinguishing them from other matrix subspaces. Learn to compute eigenvalues and eigenvectors, understand their properties, and use numpy functions to find eigenspaces, enhancing your grasp of linear algebra in data science.
Eigenspace vs. matrix subspaces
Unlike the column, row, and null spaces, the eigenspace only exists for a square matrix. The eigenspace of a square matrix, A, is the set of vectors that preserve their direction under the linear transformation given by A. Furthermore, an eigenspace can be complex, in contrast to the other spaces of a real matrix, which are always real.
Definition
Formally, an eigenspace of an n × n matrix, A, is defined as:
E_λ(A) = { v : Av = λv }, where λ is an eigenvalue of A and v is a corresponding eigenvector of A. Thus, an eigenspace consists of all the eigenvectors of a matrix corresponding to a given eigenvalue (together with the zero vector).
Example
Consider a matrix, A, with an eigenvalue, λ = 6, and a corresponding eigenvector, v. Notice that every nonzero multiple of v (that is, cv for a scalar c ≠ 0) is also a valid eigenvector corresponding to the same eigenvalue. Therefore, by definition, the eigenspace corresponding to eigenvalue 6 is the span of v.
Note: There are infinitely many eigenvectors corresponding to each eigenvalue of a matrix.
Below is a visualization that shows how the eigenvectors of a matrix preserve their direction under the linear transformation of that matrix. The red and blue vectors are eigenvectors corresponding to an eigenvalue, λ. These vectors keep their direction under the transformation by the matrix A. In contrast, the brown vector changes its direction.
Note: A matrix may have several distinct eigenvalues and an eigenspace corresponds to each of them.
Computing eigenvalues and eigenvectors in numpy
In numpy, we can compute eigenvalues and the basis of corresponding eigenspaces using the np.linalg.eig function. Let’s try it out for the matrix in the example above.
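The lesson's exact example matrix wasn't preserved here, so the sketch below assumes a matrix that is consistent with the example (it has eigenvalue 6):

```python
import numpy as np

# Hypothetical matrix for illustration; it has eigenvalue 6,
# matching the example above (the lesson's exact matrix is assumed).
A = np.array([[5.0, 1.0],
              [1.0, 5.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding (unit-norm) eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)  # two eigenvalues, 6 and 4 (order may vary)

for i, lam in enumerate(eigenvalues):
    v = eigenvectors[:, i]
    # Each column v satisfies A @ v == lam * v (up to floating-point error).
    print(lam, np.allclose(A @ v, lam * v))
```

Each column of the second return value is a basis vector for the eigenspace of the matching eigenvalue.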
For the matrix in the example above, the eigenvalues are distinct, and each has its own eigenspace. This isn't, however, generalizable: a general matrix may not have distinct eigenvalues. An eigenspace can have more than one dimension, but the sum of the dimensions of all the eigenspaces of an n × n matrix can't be larger than n.
Eigenspace vs. null space
The eigenvectors of a matrix, A, satisfy Av = λv for some scalar λ. We can rewrite this as (A − λI)v = 0, where I is the identity matrix. In this way, we discover that the eigenspace of a matrix, A, corresponding to the eigenvalue, λ, is the null space of the matrix A − λI.
Moreover, for the null space of A − λI to be more than zero-dimensional, λ must make A − λI singular. This implies det(A − λI) = 0.
Note: A matrix, A, may or may not be singular, but the matrix A − λI is always singular, where λ is an eigenvalue of A.
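We can check this numerically. The snippet below assumes the same hypothetical matrix as before, with eigenvalue 6:

```python
import numpy as np

# Assumed example matrix with eigenvalue 6 (hypothetical).
A = np.array([[5.0, 1.0],
              [1.0, 5.0]])

# A - lam*I is singular whenever lam is an eigenvalue of A,
# so its determinant is zero (up to floating-point error).
print(np.linalg.det(A - 6.0 * np.eye(2)))  # ~0.0

# For a value that is not an eigenvalue, the determinant is nonzero:
print(np.linalg.det(A - 2.0 * np.eye(2)))  # 8.0
```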
Eigenvalues computation
To compute the eigenvalues of a matrix, A, we need to solve the characteristic equation, det(A − λI) = 0, for λ. For each value of λ, a corresponding eigenvector, v, can be found by solving (A − λI)v = 0.
Example
The eigenvalues of A can be computed by solving det(A − λI) = 0.
We can now use both values of λ to calculate corresponding eigenvectors. Let's find an eigenvector, v, that corresponds to the first eigenvalue. To that end, we have to find a nonzero solution of (A − λI)v = 0.
Thus, v is an eigenvector corresponding to that eigenvalue. Furthermore, all nonzero multiples of v are also eigenvectors corresponding to the same eigenvalue. All these eigenvectors make up one eigenspace. Moreover, there's another eigenspace corresponding to the other eigenvalue.
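The worked numbers from the original example weren't preserved, so here is the same procedure sketched for the assumed matrix from earlier. Its characteristic polynomial is det(A − λI) = (5 − λ)² − 1 = λ² − 10λ + 24 = (λ − 6)(λ − 4), giving eigenvalues 6 and 4:

```python
import numpy as np

# Assumed example matrix (hypothetical).
A = np.array([[5.0, 1.0],
              [1.0, 5.0]])

# np.poly recovers the characteristic polynomial's coefficients:
coeffs = np.poly(A)       # [1., -10., 24.], i.e. lam^2 - 10*lam + 24
print(np.roots(coeffs))   # eigenvalues 6 and 4 (order may vary)

# For lam = 6, solve (A - 6I)v = 0. Here A - 6I = [[-1, 1], [1, -1]],
# whose null space is spanned by v = [1, 1]:
v = np.array([1.0, 1.0])
print((A - 6.0 * np.eye(2)) @ v)    # [0. 0.]
print(np.allclose(A @ v, 6.0 * v))  # True: v is an eigenvector for 6
```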
Eigenspace of a singular matrix
In the case that a matrix, A, is singular, one of its eigenvalues must be equal to 0. Furthermore, the null space of A is the same as the eigenspace of A corresponding to the eigenvalue 0.
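A quick numerical check, using a hypothetical singular matrix (its second row is twice the first):

```python
import numpy as np

# Hypothetical singular matrix: rows are linearly dependent.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.det(A))  # ~0.0 -> singular, so 0 must be an eigenvalue

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)  # one eigenvalue is (numerically) 0, the other is 5

# The eigenspace for eigenvalue 0 is the null space of A.
# Here it is spanned by v = [2, -1], since A @ v = 0:
v = np.array([2.0, -1.0])
print(np.allclose(A @ v, 0.0 * v))  # True
```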
Properties of eigenvalues and eigenvectors
- The sum of eigenvalues is the trace of the matrix.
- The product of eigenvalues is the determinant of the matrix.
Note: The fact that the product of the eigenvalues is the determinant of the matrix corresponds to the statement that, for singular matrices, at least one eigenvalue equals zero.
- The eigenspaces corresponding to different eigenvalues are linearly independent. That is, the union of the bases of all eigenspaces is a linearly independent set.
- The eigenvalues of a symmetric matrix are real.
- The eigenspaces of a symmetric matrix are mutually orthogonal.
- If λ is an eigenvalue of an invertible matrix A, then 1/λ will be an eigenvalue of A⁻¹, and the respective eigenspaces will be the same.
- The eigenvalues of A and Aᵀ are the same.
- The eigenvalues of a triangular matrix are the diagonal entries of the matrix.
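These properties can all be verified numerically. The sketch below assumes a small symmetric matrix and a triangular matrix chosen for illustration:

```python
import numpy as np

# Hypothetical symmetric matrix for checking the properties above.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
w, V = np.linalg.eig(A)

print(np.isclose(w.sum(), np.trace(A)))        # sum of eigenvalues = trace
print(np.isclose(w.prod(), np.linalg.det(A)))  # product = determinant
print(np.all(np.isreal(w)))                    # symmetric -> real eigenvalues
print(np.isclose(V[:, 0] @ V[:, 1], 0.0))      # eigenspaces are orthogonal

# A and A^T have the same eigenvalues:
print(np.allclose(np.sort(np.linalg.eigvals(A.T)), np.sort(w)))

# Eigenvalues of the inverse are the reciprocals, same eigenspaces:
w_inv = np.linalg.eigvals(np.linalg.inv(A))
print(np.allclose(np.sort(w_inv), np.sort(1.0 / w)))

# Triangular matrix: eigenvalues are the diagonal entries.
T = np.array([[4.0, 7.0],
              [0.0, 9.0]])
print(np.sort(np.linalg.eigvals(T)))  # [4. 9.]
```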