Verifying Linear Independence

Learn to verify if given vectors are linearly independent, and if not, find the relationship between the dependent vectors.

Linear dependence

We defined a collection of vectors $\bold x_1, \bold x_2, \cdots, \bold x_n$ to be linearly dependent if there exists a combination of scalars $w_1, w_2, \cdots, w_n$, with at least one nonzero value, such that

$$w_1\bold{x_1}+w_2\bold{x_2}+\cdots+w_n\bold{x_n}=\bold{0}$$

There could be infinitely many possible linear combinations, and it’s not possible to check them one by one. We’ve already discussed a few special cases and tricks for spotting linear dependence in a collection. This lesson uses elimination as a formal method to solve the vector equation and find the possible combinations.

Procedure

Our goal is to find the combination of scalars $w_1, w_2, \cdots, w_n$ such that

$$w_1\bold{x_1}+w_2\bold{x_2}+\cdots+w_n\bold{x_n}=\bold{0}$$

Expanding the vectors, we get

$$w_1 \begin{bmatrix} x_{11} \\ x_{12} \\ \vdots \\ x_{1m} \end{bmatrix} + w_2 \begin{bmatrix} x_{21} \\ x_{22} \\ \vdots \\ x_{2m} \end{bmatrix} + \cdots + w_n \begin{bmatrix} x_{n1} \\ x_{n2} \\ \vdots \\ x_{nm} \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix}$$

The equation can be converted to a linear system of $m$ equations in $n$ variables. The augmented matrix representation of the system is as follows:

$$\left( \begin{array}{cccc|c} x_{11} & x_{21} & \cdots & x_{n1} & 0 \\ x_{12} & x_{22} & \cdots & x_{n2} & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ x_{1m} & x_{2m} & \cdots & x_{nm} & 0 \end{array} \right)$$

Because the right-hand side vector $\bold b$ is the zero vector, the system has the form $A\bold{w}=\bold{0}$, where the columns of $A$ are the vectors $\bold x_1, \bold x_2, \cdots, \bold x_n$, and it is always consistent. The two possible solution sets, “only trivial solution” and “infinitely many solutions,” correspond to linear independence and linear dependence, respectively.
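
To make the procedure concrete, here is a minimal sketch in Python, assuming SymPy is available (the function name and setup are ours, not part of the lesson). It stacks the given vectors as the columns of $A$, row-reduces, and counts the pivot columns:

```python
from sympy import Matrix

def is_linearly_independent(vectors):
    """Return True when the given vectors are linearly independent.

    vectors: a list of equal-length lists, each one a vector x_i.
    """
    # Place the vectors as the columns of A, so the system reads A w = 0.
    A = Matrix(vectors).T
    # rref() returns the reduced row echelon form and the pivot column indices.
    _, pivot_cols = A.rref()
    # A pivot in every column means no free variables: only the trivial solution.
    return len(pivot_cols) == A.cols
```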

Only trivial solution

Suppose the $rref$ of $A$ for the system $A\bold w=\bold 0$ has a pivot in every column (that is, there are no free variables); for a square $A$, this means the $rref$ is the identity matrix. In that case, the only possible solution is the trivial solution, $\bold w=\bold 0$. This means that the only combination of scalars satisfying the vector equation is all zeros, and the vectors in the collection are linearly independent.
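
Equivalently, having a pivot in every column means that the rank of $A$ equals the number of vectors $n$. A quick numerical version of this test, sketched with NumPy (again an illustration, not part of the lesson), could look like this:

```python
import numpy as np

def only_trivial_solution(A):
    """Check whether A w = 0 admits only the trivial solution w = 0.

    A: a 2-D array whose columns are the vectors x_1, ..., x_n.
    """
    # The rank counts the pivot columns; matching the number of columns
    # means there are no free variables, so the columns are independent.
    return np.linalg.matrix_rank(A) == A.shape[1]
```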

Example

Consider the following set of vectors:

$$\bold{x_1} = \begin{bmatrix} 5 \\ 2 \\ 4 \end{bmatrix}, \quad \bold{x_2} = \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix}, \quad \bold{x_3} = \begin{bmatrix} 10 \\ 9 \\ 15 \end{bmatrix}$$

Writing the vector equation, we get

$$w_1 \begin{bmatrix} 5 \\ 2 \\ 4 \end{bmatrix} + w_2 \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix} + w_3 \begin{bmatrix} 10 \\ 9 \\ 15 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}$$

Our next step is to write an augmented matrix and apply elimination.

$$\left( \begin{array}{ccc|c} 5 & 0 & 10 & 0 \\ 2 & 1 & 9 & 0 \\ 4 & 1 & 15 & 0 \end{array} \right)$$
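
For reference, the same elimination can be reproduced with SymPy (an illustration under the same assumptions as the earlier sketch, not the lesson's own worked steps):

```python
from sympy import Matrix

# Columns of the coefficient matrix are x1, x2, and x3 from the example.
A = Matrix([[5, 0, 10],
            [2, 1,  9],
            [4, 1, 15]])

rref_A, pivot_cols = A.rref()
print(rref_A)       # reduces to the 3x3 identity matrix
print(pivot_cols)   # (0, 1, 2): a pivot in every column
```

The reduction leaves a pivot in every column, so the only solution is $w_1 = w_2 = w_3 = 0$ and the three vectors are linearly independent.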
