
Eigenspace

Explore the concept of eigenspaces for square matrices, distinguishing them from other matrix subspaces. Learn to compute eigenvalues and eigenvectors, understand their properties, and use numpy functions to find eigenspaces, enhancing your grasp of linear algebra in data science.

Eigenspace vs. matrix subspaces

Unlike the column, row, and null spaces, the eigenspace only exists for a square matrix. The eigenspace of a square matrix $A$ is the set of vectors that preserve their direction under the linear transformation defined by $A$. Furthermore, an eigenspace can be complex, in contrast to the other spaces of a real matrix, which are always real.

Definition

Formally, the eigenspace of an $n\times n$ matrix $A$ is defined as

$$E_\lambda(A)=\{\mathbf{x} \mid A\mathbf{x}=\lambda\mathbf{x}\},$$

where $\lambda$ is an eigenvalue of $A$ and $\mathbf{x}$ is a corresponding eigenvector. Thus, the eigenspace consists of all the eigenvectors of a matrix corresponding to a given eigenvalue (together with the zero vector).

Example

Consider the matrix $A=\begin{bmatrix}3 & 2\\3 & 4\end{bmatrix}$, the eigenvalue $\lambda=6$, and a corresponding eigenvector $\mathbf{x}=\begin{bmatrix}10\\15\end{bmatrix}$. Notice that every multiple of $\mathbf{x}$ (that is, $\mathrm{span}(\mathbf{x})$) is also a valid eigenvector corresponding to the same eigenvalue, for example $\frac{1}{5}\mathbf{x}=\begin{bmatrix}2 \\ 3\end{bmatrix}$. Therefore, by definition, the eigenspace corresponding to the eigenvalue $6$ is the span of $\mathbf{x}$.

Note: There are infinitely many eigenvectors corresponding to each eigenvalue of a matrix.
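As a quick numerical check of the example above (a minimal sketch using numpy), we can verify that $A\mathbf{x}=\lambda\mathbf{x}$ holds both for the eigenvector itself and for a multiple of it:

```python
import numpy as np

# Matrix and eigenpair from the example above.
A = np.array([[3, 2], [3, 4]], dtype=np.float64)
lam = 6.0
x = np.array([10.0, 15.0])

# A @ x equals lambda * x, and the same holds for any multiple of x.
print(A @ x)                                    # [60. 90.], i.e., 6 * x
print(np.allclose(A @ x, lam * x))              # True
print(np.allclose(A @ (x / 5), lam * (x / 5)))  # True
```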

Below is a visualization that shows how eigenvectors of a matrix preserve their direction under the linear transformation of that matrix. The red and blue vectors are eigenvectors corresponding to the eigenvalue $6$; they keep their direction under the transformation by $A$. In contrast, the brown vector does not.

Note: A matrix may have several distinct eigenvalues, and a separate eigenspace corresponds to each of them.

Computing eigenvalues and eigenvectors in numpy

In numpy, we can compute eigenvalues and the basis of corresponding eigenspaces using the np.linalg.eig function. Let’s try it out for the matrix in the example above.

Python 3.8
import numpy as np
from numpy.linalg import eig
A = np.array([[3, 2], [3, 4]], dtype=np.float32)
eigVals, eigVecs = eig(A)
print(f'Eigenvalues of\n{A}\n are\n{eigVals}')
print(f'The corresponding eigenspace-basis vectors are the columns of \n{np.round(eigVecs, 2)}')

For the matrix $A$ in the example above, there are $2$ distinct eigenvalues and two corresponding eigenspaces. This isn't, however, generalizable: a general $n\times n$ matrix $A$ may not have $n$ distinct eigenvalues. An eigenspace can have more than one dimension, but the sum of the dimensions of all the eigenspaces of $A$ can't be larger than $n$.
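To illustrate, here is a small sketch with a matrix whose eigenvalue is repeated (the matrix below is chosen for demonstration, not taken from the lesson). Its only eigenvalue, $1$, has a one-dimensional eigenspace, so the sum of eigenspace dimensions is less than $n=2$:

```python
import numpy as np
from numpy.linalg import eig, matrix_rank

# A "defective" matrix: the eigenvalue 1 is repeated twice, but its
# eigenspace is only one-dimensional.
A = np.array([[1, 1], [0, 1]], dtype=np.float64)
eigVals, eigVecs = eig(A)
print(eigVals)               # [1. 1.]
print(matrix_rank(eigVecs))  # 1: only one linearly independent eigenvector
```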

Eigenspace vs. null space

The eigenvectors of a matrix $A$ satisfy $A\mathbf{x}=\lambda\mathbf{x}$ for some scalar $\lambda$. We can rewrite this as $(A-\lambda I)\mathbf{x}=\mathbf{0}$, where $I$ is the identity matrix. In this way, we discover that the eigenspace of $A$ corresponding to the eigenvalue $\lambda$ is the null space of the matrix $A-\lambda I$.

Moreover, for the null space of $A-\lambda I$ to be more than zero-dimensional, $\lambda$ must make $A-\lambda I$ singular. This implies

$$\det(A-\lambda I)=0$$

Note: A matrix $A$ may or may not be singular, but the matrix $A-\lambda I$ is always singular when $\lambda$ is an eigenvalue of $A$.
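We can check this numerically for the running example (a minimal sketch): $A$ itself is nonsingular, but $A-6I$ is singular because $6$ is an eigenvalue of $A$.

```python
import numpy as np
from numpy.linalg import det

A = np.array([[3, 2], [3, 4]], dtype=np.float64)
I = np.eye(2)

print(det(A))          # ~6: A itself is nonsingular
print(det(A - 6 * I))  # ~0: A - 6I is singular, since 6 is an eigenvalue
```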

Eigenvalues computation

To compute the eigenvalues of a matrix $A$, we need to solve the equation $\det(A-\lambda I)=0$ for $\lambda$. For each value of $\lambda$, a corresponding eigenvector $\mathbf{x}$ can be found by solving $(A-\lambda I)\mathbf{x}=\mathbf{0}$.
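This procedure can be sketched in numpy: `np.poly` returns the coefficients of the characteristic polynomial $\det(A-\lambda I)$, and `np.roots` finds its roots, which are the eigenvalues:

```python
import numpy as np

A = np.array([[3, 2], [3, 4]], dtype=np.float64)

# Coefficients of the characteristic polynomial det(A - lambda*I).
coeffs = np.poly(A)      # [1, -7, 6] -> lambda^2 - 7*lambda + 6
# Its roots are the eigenvalues of A.
print(np.roots(coeffs))  # 6 and 1
```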

Example

The eigenvalues of $A=\begin{bmatrix}3 & 2\\3 & 4\end{bmatrix}$ can be computed as follows:

$$\begin{align*} \begin{vmatrix}3-\lambda & 2\\3 & 4-\lambda\end{vmatrix} &= 0 \\ (3-\lambda)(4-\lambda)-6 &= 0 \\ \lambda^2-7\lambda+6 &= 0 \\ \lambda^2-\lambda-6\lambda+6 &= 0 \\ \lambda(\lambda-1)-6(\lambda-1) &= 0 \\ (\lambda-1)(\lambda-6) &= 0 \\ \lambda &\in \{1,6\} \end{align*}$$

We can now use both values of $\lambda$ to calculate corresponding eigenvectors. Let's find an eigenvector $\mathbf{x}$ that corresponds to $\lambda=1$. To that end, we have to solve $(A-\lambda I)\mathbf{x}=\mathbf{0}$:

$$\begin{align*} &\begin{bmatrix}2 & 2\\3 & 3\end{bmatrix} \begin{bmatrix}x_1 \\ x_2\end{bmatrix}= \begin{bmatrix}0 \\ 0\end{bmatrix}\\ &\left[\begin{array}{cc|c} 1 & 1 & 0 \\ 0 & 0 & 0 \end{array}\right]\\ &\begin{bmatrix}x_1 \\ x_2\end{bmatrix}=\begin{bmatrix} -1 \\ 1\end{bmatrix} \end{align*}$$

Thus, $\mathbf{x}=\begin{bmatrix} -1 \\ 1\end{bmatrix}$ is an eigenvector corresponding to the eigenvalue $1$. Furthermore, all multiples of $\mathbf{x}$ are also eigenvectors corresponding to the eigenvalue $1$. All these eigenvectors together make up an eigenspace. Moreover, there's another eigenspace corresponding to the eigenvalue $6$.
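The same eigenvector can be recovered numerically. The eigenspace for $\lambda=1$ is the null space of $A-I$, which can be read off the SVD of that matrix (a sketch; the sign of the computed vector may differ from the hand calculation):

```python
import numpy as np

A = np.array([[3, 2], [3, 4]], dtype=np.float64)
M = A - 1 * np.eye(2)  # A - lambda*I for lambda = 1

# The last right-singular vector of this rank-1 matrix spans its null
# space, i.e., the eigenspace for lambda = 1.
_, s, Vt = np.linalg.svd(M)
x = Vt[-1]
print(x)                      # proportional to [-1, 1] (sign may flip)
print(np.allclose(A @ x, x))  # True: A x = 1 * x
```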

Eigenspace of a singular matrix

In the case that a matrix $A$ is singular, one of its eigenvalues must be equal to $0$. Furthermore, the null space of $A$ is the same as the eigenspace of $A$ corresponding to the eigenvalue $0$:

$$A\mathbf{x}=\mathbf{0}\implies A\mathbf{x}=0\,\mathbf{x}$$
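A quick sketch with a rank-1 matrix (chosen here for illustration) shows both facts: the determinant is zero, one eigenvalue is zero, and the eigenvector for that eigenvalue lies in the null space of $A$:

```python
import numpy as np
from numpy.linalg import det, eig

# A rank-1 (singular) matrix, chosen for illustration.
A = np.array([[1, 2], [2, 4]], dtype=np.float64)
print(det(A))  # ~0: A is singular

eigVals, eigVecs = eig(A)
print(eigVals)  # one eigenvalue is 0; the other equals the trace, 5

# The eigenvector for eigenvalue 0 is in the null space of A: A @ v = 0.
i = np.argmin(np.abs(eigVals))
print(np.allclose(A @ eigVecs[:, i], 0))  # True
```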

Properties of eigenvalues and eigenvectors

  1. The sum of eigenvalues is the trace of the matrix.
  2. The product of eigenvalues is the determinant of the matrix.
Python 3.8
import numpy as np
from numpy.linalg import det, eig
A = np.random.rand(3, 3)
eigVals, eigVecs = eig(A)
# Verify product of eigenvalues equals determinant.
print(f'Product of Eigenvalues = {np.prod(eigVals)}')
print(f'Determinant of the matrix = {det(A)}')
# Verify sum of eigenvalues equals trace.
print(f'Sum of Eigenvalues = {np.sum(eigVals)}')
print(f'Trace of the matrix = {np.trace(A)}')

Note: The fact that the product of the eigenvalues equals the determinant is consistent with the earlier statement that, for singular matrices, at least one eigenvalue is zero.

  3. The eigenspaces corresponding to different eigenvalues are linearly independent. That is, the union of the bases of all eigenspaces is a linearly independent set.
Python 3.8
import numpy as np
from numpy.linalg import eig, matrix_rank as rank
A = np.random.rand(10, 10)
eigVals, eigVecs = eig(A)
print(f'Number of basis vectors of all eigenspaces = {eigVecs.shape[1]}')
print(f'Number of linearly independent eigenvectors = {rank(eigVecs)}')
  4. The eigenvalues of a symmetric matrix are real.
  5. The eigenspaces of a symmetric matrix are mutually orthogonal.
Python 3.8
import numpy as np
from numpy.linalg import eig
A = np.array([[1, 2], [2, 3]])
eigVals, eigVecs = eig(A)
print(f'A=\n{A}\nEigenvalues(A)={eigVals}')
print(f'Dot product of eigenvectors = {eigVecs[:, 0].dot(eigVecs[:, 1])}')
  6. If $\lambda$ is an eigenvalue of $A$, then $\frac{1}{\lambda}$ is an eigenvalue of $A^{-1}$, and the respective eigenspaces are the same.
Python 3.8
import numpy as np
from numpy.linalg import inv, eig
A = np.random.rand(2, 2)
eigVals, eigVecs = eig(A)
eigVals_inv, eigVecs_inv = eig(inv(A))
# Print eigenvalues of A and inverse(A)
print(f'A=\n{A}\nEigenvals(A)={eigVals}\n1/Eigenvals(inv(A))={1/eigVals_inv}')
# Print eigenspaces of A and inverse(A)
print(f'Eigenvectors(A)\n{eigVecs}\nEigenvectors(inv(A))\n{eigVecs_inv}')
  7. The eigenvalues of $A$ and $A^T$ are the same.
Python 3.8
import numpy as np
from numpy.linalg import eig
A = np.random.rand(2, 2)
eigVals, eigVecs = eig(A)
eigVals_T, eigVecs_T = eig(A.T)
print(f'A\n{A}\nEigenvalues(A)={eigVals}\nEigenvalues(transpose(A))={eigVals_T}')
  8. The eigenvalues of a triangular matrix are the diagonal entries of the matrix.
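This last property can be checked with a sketch using a random upper-triangular matrix:

```python
import numpy as np
from numpy.linalg import eig

rng = np.random.default_rng(0)
# Upper-triangular matrix: np.triu zeroes out everything below the diagonal.
A = np.triu(rng.random((4, 4)))

eigVals, _ = eig(A)
# Sorted eigenvalues match the sorted diagonal entries.
print(np.sort(eigVals.real))
print(np.sort(np.diag(A)))
```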