Column Space
Learn to define and compute the column space of a matrix, identify its basis via reduced row echelon form, and understand its role in linear transformations, vector spaces, and consistency of linear systems in data science.
Definition
The column space of a matrix, $A$, often denoted by $C(A)$, is the vector space spanned by the column vectors of $A$.
The column space of an $m \times n$ matrix $A$ is a subspace of $\mathbb{R}^m$. The dimension of the column space is the number of linearly independent columns, that is, the rank of the matrix $A$.
Examples
-
The column space of a matrix $A = [\,a_1 \;\; a_2 \;\; \cdots \;\; a_n\,]$ is $\text{span}\{a_1, a_2, \dots, a_n\}$.
-
The column space of a matrix such as $\begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}$ is a one-dimensional subspace of $\mathbb{R}^2$. This is because there is only a single linearly independent column: each column is a scalar multiple of the others. Any one of these columns can serve as a basis for the subspace.
-
The column space of a zero matrix, $0_{m \times n}$, is the zero-dimensional subspace $\{\mathbf{0}\}$ of $\mathbb{R}^m$.
-
The column space of an invertible $n \times n$ matrix $A$ is $\mathbb{R}^n$. This is because all the columns of such a matrix are linearly independent. That is, $\text{rank}(A) = n$.
The columns of every invertible matrix form a basis of $\mathbb{R}^n$.
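These rank facts can be checked numerically. Below is a small sketch using NumPy; the matrices are illustrative stand-ins, not the original examples:

```python
import numpy as np

# A rank-1 matrix: the second column is twice the first,
# so the column space is a one-dimensional subspace of R^2.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# The zero matrix: its column space is the zero-dimensional subspace {0}.
Z = np.zeros((3, 3))

# An invertible matrix (the identity): its column space is all of R^3.
I = np.eye(3)

# The rank equals the dimension of the column space.
print(np.linalg.matrix_rank(A))  # 1
print(np.linalg.matrix_rank(Z))  # 0
print(np.linalg.matrix_rank(I))  # 3
```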
Basis of column space
A basis of the column space of a matrix, $A$, is a set of linearly independent vectors that spans $C(A)$; a maximal set of linearly independent columns of $A$ is one such basis. One algorithm for computing a basis is as follows:
-
Take the transpose of the matrix, $A$, to convert the columns into rows.
-
Compute the reduced row echelon form, $\text{rref}(A^T)$, of the transposed matrix.
-
The set of non-zero rows in $\text{rref}(A^T)$ is a linearly independent set. These rows, transposed back into column vectors, form a basis of the column space.
Note that the basis found by the algorithm above may not contain the original column vectors, but rather their linear combinations, as elementary row operations have transformed the matrix into $\text{rref}(A^T)$.
Let’s try it out by using the following code:
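A minimal sketch of the three steps using SymPy (the matrix `A` below is an assumed example, chosen so that its third column is the sum of the first two):

```python
import sympy as sp

# An assumed example matrix: the third column equals the sum of the
# first two, so the rank (and the basis size) is 2.
A = sp.Matrix([[1, 2, 3],
               [2, 4, 6],
               [1, 0, 1]])

# Step 1: transpose, so the columns of A become rows.
# Step 2: compute the reduced row echelon form of the transpose.
R, _pivots = A.T.rref()

# Step 3: the non-zero rows of rref(A^T), transposed back into
# column vectors, form a basis of the column space of A.
basis = [R.row(i).T for i in range(R.rows) if any(R.row(i))]

for v in basis:
    print(v.T)  # the basis vectors (1, 2, 0) and (0, 0, 1)
```

As noted above, these basis vectors are linear combinations of the original columns, not the columns themselves; SymPy's built-in `A.columnspace()` instead returns a basis made of the original pivot columns.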
Verifying a basis of column space
Given a set, $S$, of linearly independent vectors from $\mathbb{R}^m$, how do we know that the set is a basis for the column space of a given $m \times n$ matrix $A$? To do this, we need to determine the following:
-
Are all the columns of $A$ in the span of $S$? If they are, it will prove that $C(A) \subseteq \text{span}(S)$.
-
Are all the vectors in $S$ in the column space of $A$? If they are, it will prove that $\text{span}(S) \subseteq C(A)$.
Both points above together prove that $C(A) = \text{span}(S)$. Here’s one algorithm for this problem:
- Create a matrix, $B$, with columns that are the vectors in $S$.
- Check whether the ranks of $A$ and $B$ are equal, say $r$.
- Create a matrix, $M$, with the columns of $A$ and $B$ (in any order).
- If the rank of the matrix, $M$, is also $r$, then $S$ is a basis of $C(A)$.
The code below implements the algorithm above.
Column space and linear transformation
The multiplication of a matrix with a vector can be viewed as a linear transformation of the vector. Consider an $m \times n$ matrix $A$ and a vector, $x \in \mathbb{R}^n$. If we look closely at the matrix, $A$, represented in column vectors as $A = [\,a_1 \;\; a_2 \;\; \cdots \;\; a_n\,]$ and the vector in its individual components, namely $x = (x_1, x_2, \dots, x_n)$, then

$$Ax = x_1 a_1 + x_2 a_2 + \cdots + x_n a_n.$$

That is, $Ax$ is always a linear combination of the columns of $A$.
Note: A linear transformation matrix, $A$, maps every vector $x \in \mathbb{R}^n$ to a vector $Ax$ in the column space of $A$.
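This identity is easy to verify numerically; here is a small sketch with assumed values:

```python
import numpy as np

# An assumed 3x2 example: A maps vectors from R^2 into its
# column space, a subspace of R^3.
A = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])
x = np.array([2.0, -1.0])

# A @ x equals the linear combination x_1*a_1 + x_2*a_2 of the
# columns of A, so the image of every x lies in C(A).
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
print(A @ x)                      # [ 2.  3. -3.]
print(np.allclose(A @ x, combo))  # True
```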
Examples
- The column space of a $2 \times 2$ matrix of rank one is a line through the origin in the direction of its single linearly independent column vector. If such a matrix, $A$, is multiplied with any vector in $\mathbb{R}^2$, it maps the vector onto that line. Below is its visualization:
- The column space of a $3 \times 3$ matrix of rank two is a plane through the origin in $\mathbb{R}^3$, with its two linearly independent column vectors as a basis. The visualization below shows how all the vectors on a cube map onto the plane when transformed using such a matrix, $A$. This is true for every vector in $\mathbb{R}^3$.
- The column space of an invertible $3 \times 3$ matrix, $A$, is $\mathbb{R}^3$, but the standard basis of $\mathbb{R}^3$ (represented by a $3 \times 3$ identity matrix) transforms to the columns of the matrix. The visualization below shows all the vectors on a cube transforming to other vectors but not collapsing to any plane or a line. This is also true for all the other vectors in $\mathbb{R}^3$.

Note: The standard basis of a vector space is the basis where each vector has exactly one non-zero entry, which equals 1.
A linear system, $Ax = b$, is consistent if $b \in C(A)$. For invertible coefficient matrices, every vector $b$ is in the column space. So, the system is always consistent and an exact solution can be obtained through the inverse of $A$. That is, $x = A^{-1}b$.
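This last point can be sketched as follows; the coefficient matrix and right-hand side are assumed examples:

```python
import numpy as np

# An assumed invertible coefficient matrix: C(A) = R^2,
# so Ax = b is consistent for every b.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

# The exact solution via the inverse: x = A^{-1} b (approximately [1, 3]).
x = np.linalg.inv(A) @ b
print(np.allclose(A @ x, b))  # True
```

In practice, `np.linalg.solve(A, b)` is preferred over forming the inverse explicitly, as it is faster and more numerically stable.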