Finding span and checking linear dependence

On this page, we find a way to check whether given vectors are linearly dependent or independent, and a way to find the span of given vectors.

Adding multiples doesn't affect linear dependence

Let $\red{\vec u}$, $\blue{\vec v}$ and $\green{\vec w}$ be linearly independent vectors. If $\red{\vec u}$, $\blue{\vec v}$ and $\green{\vec w} + 2\blue{\vec v}$ are linearly dependent, we get a contradiction: there would be numbers $a$, $b$ and $c$, not all zero, with $$ a\red{\vec u} + b\blue{\vec v} + c(\green{\vec w} + 2\blue{\vec v}) = \vec 0, $$ and rearranging gives $$ a\red{\vec u} + (b + 2c)\blue{\vec v} + c\green{\vec w} = \vec 0. $$ Here $a$, $b + 2c$ and $c$ are still not all zero (if $a = 0$ and $c = 0$, then $b \ne 0$), so $\red{\vec u}$, $\blue{\vec v}$ and $\green{\vec w}$ would be linearly dependent.

So if $\red{\vec u}$, $\blue{\vec v}$ and $\green{\vec w}$ are linearly independent, then $\red{\vec u}$, $\blue{\vec v}$ and $\green{\vec w} + 2\blue{\vec v}$ must also be linearly independent: adding $2\blue{\vec v}$ to $\green{\vec w}$ preserves the linear independence.

The same also works with dependent instead of independent: if $\red{\vec u}$, $\blue{\vec v}$ and $\green{\vec w}$ are linearly dependent, then $\red{\vec u}$, $\blue{\vec v}$ and $\green{\vec w} + 2\blue{\vec v}$ are also linearly dependent. They cannot be independent, because if they were, we could add $-2\blue{\vec v}$ to $\green{\vec w} + 2\blue{\vec v}$ to get back to just $\green{\vec w}$; by the above result, this would lead to $\red{\vec u}$, $\blue{\vec v}$ and $\green{\vec w}$ being linearly independent, contradicting the assumption that they are linearly dependent.

These results work with any number instead of $2$ (in other words, with any multiple of $\blue{\vec v}$ instead of $2\blue{\vec v}$).

With two or more vectors, adding a multiple of one vector to another doesn't affect linear dependence or independence.

Here a multiple of a vector means the vector multiplied by some number. For example, $5\red{\vec u}$, $-2\red{\vec u}$ and $0\red{\vec u}$ are multiples of $\red{\vec u}$.
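
If you want a quick numerical sanity check of the result above, here is a minimal sketch (assuming Python with NumPy is available; the vectors are just an example). NumPy's matrix_rank function counts the linearly independent rows of a matrix, so three vectors are linearly independent exactly when the rank is 3.

```python
import numpy as np

# Example vectors: any three linearly independent vectors work here.
u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 1.0])
w = np.array([3.0, 0.0, 1.0])

# matrix_rank counts the linearly independent rows of a matrix.
print(np.linalg.matrix_rank(np.array([u, v, w])))          # 3: independent
print(np.linalg.matrix_rank(np.array([u, v, w + 2 * v])))  # still 3
```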

Checking linear dependence/independence

Using the above result, you can check whether given vectors are linearly dependent or independent by repeatedly adding multiples of one vector to another. In this context, it is common to write the vectors in a matrix so that each row of the matrix represents one vector. The goal is to get as many zeros as possible, so that you can easily see whether the vectors are dependent or independent.

For example, let's check if the vectors $(2, 4, 4)$, $(0, 2, 0)$ and $(4, 2, 8)$ are linearly dependent or independent.

$$\begin{align} \begin{bmatrix} \red{2} & \red{4} & \red{4} \\ \blue{0} & \blue{2} & \blue{0} \\ \magenta{4} & \magenta{2} & \magenta{8} \\ \end{bmatrix} &\to \begin{bmatrix} \red{2} & \red{4} & \red{4} \\ \blue{0} & \blue{2} & \blue{0} \\ \green{4} & \green{0} & \green{8} \\ \end{bmatrix} \quad \green{\text{new bottom}} = \magenta{\text{old bottom}} + \left( -1 \right) \cdot \blue{\text{middle}} \\ &\to \begin{bmatrix} \darkyellow{2} & \darkyellow{0} & \darkyellow{4} \\ \blue{0} & \blue{2} & \blue{0} \\ \green{4} & \green{0} & \green{8} \\ \end{bmatrix} \quad \darkyellow{\text{new top}} = \red{\text{old top}} + \left( -2 \right) \cdot \blue{\text{middle}} \\ &\to \begin{bmatrix} \darkyellow{2} & \darkyellow{0} & \darkyellow{4} \\ \blue{0} & \blue{2} & \blue{0} \\ \red{0} & \red{0} & \red{0} \\ \end{bmatrix} \quad \red{\text{new bottom}} = \green{\text{old bottom}} + \left( -2 \right) \cdot \darkyellow{\text{top}} \\ \end{align}$$

The numbers to multiply by are chosen so that we get new zeros. For example, the $-1$ was chosen because that resulted in $\magenta{2} + (-1)\blue{2} = \green{0}$ in the bottom row.

Because one of the resulting vectors is $\red{(0,0,0)}$, it is a linear combination of the other vectors: $$ \red{(0,0,0)} = 0\darkyellow{(2,0,4)} + 0\blue{(0,2,0)} $$ This means that the resulting vectors $\darkyellow{(2,0,4)}$, $\blue{(0,2,0)}$ and $\red{(0,0,0)}$ are linearly dependent, so the original vectors $(2, 4, 4)$, $(0, 2, 0)$ and $(4, 2, 8)$ are also linearly dependent.
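
This checking process is easy to automate. Here is a minimal sketch in Python (an illustration, not part of the method itself), using exact fractions so that no rounding errors sneak in; it also swaps two rows when needed, which merely reorders the vectors and changes nothing:

```python
from fractions import Fraction

def reduce_rows(vectors):
    """Row-reduce the vectors by repeatedly adding multiples
    of one row to another, as in the text."""
    rows = [[Fraction(x) for x in v] for v in vectors]
    pivot_row = 0
    for col in range(len(rows[0])):
        # Find a row at pivot_row or below with a nonzero entry in this column.
        for r in range(pivot_row, len(rows)):
            if rows[r][col] != 0:
                rows[pivot_row], rows[r] = rows[r], rows[pivot_row]
                break
        else:
            continue  # this column is already all zeros in the remaining rows
        # Add a multiple of the pivot row to every other row to create zeros.
        for r in range(len(rows)):
            if r != pivot_row and rows[r][col] != 0:
                factor = -rows[r][col] / rows[pivot_row][col]
                rows[r] = [a + factor * b for a, b in zip(rows[r], rows[pivot_row])]
        pivot_row += 1
    return rows

reduced = reduce_rows([(2, 4, 4), (0, 2, 0), (4, 2, 8)])
print(reduced)  # [[2, 0, 4], [0, 2, 0], [0, 0, 0]] as Fractions
print(any(all(x == 0 for x in row) for row in reduced))  # True: dependent
```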

Finding the span

Consider the spans $$ \span(\red{\vec u}, \blue{\vec v}) \quad \text{and} \quad \span(\red{\vec u}, \green{\vec v + 2\vec u}). $$ Because $\green{\vec v + 2\vec u}$ is a linear combination of $\red{\vec u}$ and $\blue{\vec v}$, adding it to their span doesn't change anything: $$ \span(\red{\vec u}, \blue{\vec v}) = \span(\red{\vec u}, \blue{\vec v}, \green{\vec v + 2\vec u}). $$ On the other hand, $$ \blue{\vec v} = \green{\vec v + 2\vec u} - 2\red{\vec u} $$ is a linear combination of $\green{\vec v + 2\vec u}$ and $\red{\vec u}$, so $$ \span(\red{\vec u}, \green{\vec v + 2\vec u}) = \span(\red{\vec u}, \blue{\vec v}, \green{\vec v + 2\vec u}). $$ Putting it all together, we get $$ \span(\red{\vec u}, \blue{\vec v}) = \span(\red{\vec u}, \green{\vec v + 2\vec u}). $$ This also works with any number instead of 2, and also with more than two vectors.

The span doesn't change when adding a multiple of one vector to another.

This means that you can find a span in the exact same way as you would check for linear dependence or independence (see above): add multiples of the vectors to each other until it is clear what the span will be.

For example, let's find $\span((2,4,5), (0,-5,-9), (3,5,5))$:

$$\begin{align} \begin{bmatrix} \red{2} & \red{4} & \red{5} \\ \blue{0} & \blue{-5} & \blue{-9} \\ \magenta{3} & \magenta{5} & \magenta{5} \\ \end{bmatrix} &\to \begin{bmatrix} \red{2} & \red{4} & \red{5} \\ \blue{0} & \blue{-5} & \blue{-9} \\ \green{0} & \green{-1} & \green{-\frac{5}{2}} \\ \end{bmatrix} \quad \green{\text{new bottom}} = \magenta{\text{old bottom}} + \left( -\frac{3}{2} \right) \cdot \red{\text{top}} \\ &\to \begin{bmatrix} \darkyellow{2} & \darkyellow{0} & \darkyellow{-5} \\ \blue{0} & \blue{-5} & \blue{-9} \\ \green{0} & \green{-1} & \green{-\frac{5}{2}} \\ \end{bmatrix} \quad \darkyellow{\text{new top}} = \red{\text{old top}} + 4 \cdot \green{\text{bottom}} \\ &\to \begin{bmatrix} \darkyellow{2} & \darkyellow{0} & \darkyellow{-5} \\ \red{0} & \red{0} & \red{\frac{7}{2}} \\ \green{0} & \green{-1} & \green{-\frac{5}{2}} \\ \end{bmatrix} \quad \red{\text{new middle}} = \blue{\text{old middle}} + \left( -5 \right) \cdot \green{\text{bottom}} \\ &\to \begin{bmatrix} \blue{2} & \blue{0} & \blue{0} \\ \red{0} & \red{0} & \red{\frac{7}{2}} \\ \green{0} & \green{-1} & \green{-\frac{5}{2}} \\ \end{bmatrix} \quad \blue{\text{new top}} = \darkyellow{\text{old top}} + \frac{10}{7} \cdot \red{\text{middle}} \\ &\to \begin{bmatrix} \blue{2} & \blue{0} & \blue{0} \\ \red{0} & \red{0} & \red{\frac{7}{2}} \\ \magenta{0} & \magenta{-1} & \magenta{0} \\ \end{bmatrix} \quad \magenta{\text{new bottom}} = \green{\text{old bottom}} + \frac{5}{7} \cdot \red{\text{middle}} \\ \end{align}$$

This calculation shows two things. First, $$ \span((2,4,5), (0,-5,-9), (3,5,5)) = \span(\blue{(2,0,0)}, \red{(0,0,\frac{7}{2})}, \magenta{(0,-1,0)}), $$ and this span contains all 3D vectors, because any vector $(x, y, z)$ is the linear combination $$ (x, y, z) = \frac{x}{2}\blue{(2,0,0)} + \frac{2z}{7}\red{(0,0,\frac{7}{2})} + (-y)\magenta{(0,-1,0)}. $$ Second, none of the resulting vectors is a linear combination of the others (each one has its nonzero entries where the other two have zeros), so the original vectors $(2,4,5)$, $(0,-5,-9)$ and $(3,5,5)$ are linearly independent.
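
If SymPy is available, its Matrix.rref method does this kind of elimination automatically (it also multiplies rows by numbers, a trick covered in the next section). Here is a small sketch; ending up with the identity matrix confirms both conclusions at once:

```python
from sympy import Matrix

M = Matrix([[2, 4, 5], [0, -5, -9], [3, 5, 5]])
reduced, pivot_columns = M.rref()
print(reduced)        # Matrix([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
print(pivot_columns)  # (0, 1, 2): no row was reduced to all zeros
```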

Multiplying a row

In the previous example, we ended up having to calculate with fractions, which is error-prone when calculating by hand. To avoid fractions, you can multiply a row by 2 whenever a fraction like $\frac{7}{2}$ appears in it. The same works with any other nonzero number instead of 2.

This doesn't affect the span: because the span consists of all linear combinations, each vector gets multiplied by every possible number anyway, so its original length doesn't matter. The same goes for linear dependence and independence. Let me know if this isn't clear to you and you would like to see more details.
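
In symbols, the reason is that for any nonzero number $c$, a linear combination that uses $\blue{\vec v}$ can be rewritten to use $c\blue{\vec v}$ instead, and vice versa: $$ a\red{\vec u} + b\blue{\vec v} = a\red{\vec u} + \frac{b}{c}\left(c\blue{\vec v}\right), $$ so the linear combinations of $\red{\vec u}$ and $\blue{\vec v}$ are exactly the linear combinations of $\red{\vec u}$ and $c\blue{\vec v}$. In particular, the spans are the same, and a combination of one pair is zero exactly when the corresponding combination of the other pair is.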

Here's the same calculation again, but with this trick in use:

$$\begin{align} \begin{bmatrix} \red{2} & \red{4} & \red{5} \\ \blue{0} & \blue{-5} & \blue{-9} \\ \magenta{3} & \magenta{5} & \magenta{5} \\ \end{bmatrix} &\to \begin{bmatrix} \red{2} & \red{4} & \red{5} \\ \blue{0} & \blue{-5} & \blue{-9} \\ \green{0} & \green{-1} & \green{-\frac{5}{2}} \\ \end{bmatrix} \quad \green{\text{new bottom}} = \magenta{\text{old bottom}} + \left( -\frac{3}{2} \right) \cdot \red{\text{top}} \\ &\to \begin{bmatrix} \red{2} & \red{4} & \red{5} \\ \blue{0} & \blue{-5} & \blue{-9} \\ \darkyellow{0} & \darkyellow{-2} & \darkyellow{-5} \\ \end{bmatrix} \quad \darkyellow{\text{new bottom}} = 2 \cdot \green{\text{old bottom}} \\ &\to \begin{bmatrix} \magenta{2} & \magenta{0} & \magenta{-5} \\ \blue{0} & \blue{-5} & \blue{-9} \\ \darkyellow{0} & \darkyellow{-2} & \darkyellow{-5} \\ \end{bmatrix} \quad \magenta{\text{new top}} = \red{\text{old top}} + 2 \cdot \darkyellow{\text{bottom}} \\ &\to \begin{bmatrix} \magenta{2} & \magenta{0} & \magenta{-5} \\ \green{0} & \green{0} & \green{\frac{7}{2}} \\ \darkyellow{0} & \darkyellow{-2} & \darkyellow{-5} \\ \end{bmatrix} \quad \green{\text{new middle}} = \blue{\text{old middle}} + \left( -\frac{5}{2} \right) \cdot \darkyellow{\text{bottom}} \\ &\to \begin{bmatrix} \magenta{2} & \magenta{0} & \magenta{-5} \\ \red{0} & \red{0} & \red{7} \\ \darkyellow{0} & \darkyellow{-2} & \darkyellow{-5} \\ \end{bmatrix} \quad \red{\text{new middle}} = 2 \cdot \green{\text{old middle}} \\ &\to \begin{bmatrix} \magenta{2} & \magenta{0} & \magenta{-5} \\ \blue{0} & \blue{0} & \blue{1} \\ \darkyellow{0} & \darkyellow{-2} & \darkyellow{-5} \\ \end{bmatrix} \quad \blue{\text{new middle}} = \frac{1}{7} \cdot \red{\text{old middle}} \\ &\to \begin{bmatrix} \green{2} & \green{0} & \green{0} \\ \blue{0} & \blue{0} & \blue{1} \\ \darkyellow{0} & \darkyellow{-2} & \darkyellow{-5} \\ \end{bmatrix} \quad \green{\text{new top}} = \magenta{\text{old top}} + 5 \cdot \blue{\text{middle}} \\ &\to \begin{bmatrix} \green{2} & \green{0} & \green{0} \\ \blue{0} & \blue{0} & \blue{1} \\ \red{0} & \red{-2} & \red{0} \\ \end{bmatrix} \quad \red{\text{new bottom}} = \darkyellow{\text{old bottom}} + 5 \cdot \blue{\text{middle}} \\ \end{align}$$

Now there are more steps, but fewer fractions.

Note that multiplying a row by a number is optional, and you don't have to do it if you don't want to. I introduce it here only to avoid dealing with annoying fractions.

Maximum number of independent vectors

Let's check whether the vectors $(2,1,1)$, $(1,5,1)$, $(2,2,3)$ and $(4,2,1)$ are linearly independent or dependent. First, we use the first vector to make all other vectors start with zero.

$$\begin{align} \begin{bmatrix} \red{2} & \red{1} & \red{1} \\ \blue{1} & \blue{5} & \blue{1} \\ \magenta{2} & \magenta{2} & \magenta{3} \\ \green{4} & \green{2} & \green{1} \\ \end{bmatrix} &\to \begin{bmatrix} \red{2} & \red{1} & \red{1} \\ \darkyellow{0} & \darkyellow{\frac{9}{2}} & \darkyellow{\frac{1}{2}} \\ \magenta{2} & \magenta{2} & \magenta{3} \\ \green{4} & \green{2} & \green{1} \\ \end{bmatrix} \quad \darkyellow{\text{new row 2}} = \blue{\text{old row 2}} + \left( -\frac{1}{2} \right) \cdot \red{\text{row 1}} \\ &\to \begin{bmatrix} \red{2} & \red{1} & \red{1} \\ \darkyellow{0} & \darkyellow{\frac{9}{2}} & \darkyellow{\frac{1}{2}} \\ \blue{0} & \blue{1} & \blue{2} \\ \green{4} & \green{2} & \green{1} \\ \end{bmatrix} \quad \blue{\text{new row 3}} = \magenta{\text{old row 3}} + \left( -1 \right) \cdot \red{\text{row 1}} \\ &\to \begin{bmatrix} \red{2} & \red{1} & \red{1} \\ \darkyellow{0} & \darkyellow{\frac{9}{2}} & \darkyellow{\frac{1}{2}} \\ \blue{0} & \blue{1} & \blue{2} \\ \magenta{0} & \magenta{0} & \magenta{-1} \\ \end{bmatrix} \quad \magenta{\text{new row 4}} = \green{\text{old row 4}} + \left( -2 \right) \cdot \red{\text{row 1}} \\ \end{align}$$

We can now use the second vector to make all vectors below it start with two zeros. The last row already starts with two zeros, so we don't need to do anything to it.

$$\begin{align} \begin{bmatrix} \red{2} & \red{1} & \red{1} \\ \darkyellow{0} & \darkyellow{\frac{9}{2}} & \darkyellow{\frac{1}{2}} \\ \blue{0} & \blue{1} & \blue{2} \\ \magenta{0} & \magenta{0} & \magenta{-1} \\ \end{bmatrix} &\to \begin{bmatrix} \red{2} & \red{1} & \red{1} \\ \darkyellow{0} & \darkyellow{\frac{9}{2}} & \darkyellow{\frac{1}{2}} \\ \green{0} & \green{0} & \green{\frac{17}{9}} \\ \magenta{0} & \magenta{0} & \magenta{-1} \\ \end{bmatrix} \quad \green{\text{new row 3}} = \blue{\text{old row 3}} + \left( -\frac{2}{9} \right) \cdot \darkyellow{\text{row 2}} \\ \end{align}$$

Finally, we can use the third vector to make the last vector all zeros:

$$\begin{align} \begin{bmatrix} \red{2} & \red{1} & \red{1} \\ \darkyellow{0} & \darkyellow{\frac{9}{2}} & \darkyellow{\frac{1}{2}} \\ \green{0} & \green{0} & \green{\frac{17}{9}} \\ \magenta{0} & \magenta{0} & \magenta{-1} \\ \end{bmatrix} &\to \begin{bmatrix} \red{2} & \red{1} & \red{1} \\ \darkyellow{0} & \darkyellow{\frac{9}{2}} & \darkyellow{\frac{1}{2}} \\ \green{0} & \green{0} & \green{\frac{17}{9}} \\ \blue{0} & \blue{0} & \blue{0} \\ \end{bmatrix} \quad \blue{\text{new row 4}} = \magenta{\text{old row 4}} + \frac{9}{17} \cdot \green{\text{row 3}} \\ \end{align}$$

Because we ended up with a zero row, the vectors we started with are linearly dependent. However, with some handling for a few special cases, the same process works with any four 3D vectors, and you can always do this to end up with a matrix that has zeros in these locations: $$ \begin{bmatrix} &&\\ 0&&\\ 0&0&\\ 0&0&0 \end{bmatrix} $$ To ensure that the first column has the needed zeros, we can do this: if the first row starts with zero, first swap it with a row that starts with a nonzero number; swapping rows only reorders the vectors, so it affects neither linear dependence nor the span. (If every row starts with zero, the first column already has its zeros.) Then add suitable multiples of the first row to the rows below it, as in the example above, to cancel their first entries.

Once this is done, we can ignore the first row and first column of the matrix, and repeat the same process with the remaining two columns and three rows. Because these rows already have zeros in the leftmost column, adding multiples of them to each other won't change the leftmost column. After repeating this for all columns, ignoring one more row every time we move on to the next column, we have the zeros.

This shows that there can't be 4 linearly independent 3D vectors. We also can't have more than 4: if there were, we could pick 4 of them, and those 4 would have to be linearly independent too, which is impossible. The same process works with any other dimension $n$ instead of 3.

There cannot be more than $n$ linearly independent $n$-dimensional vectors.
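
Here is a small sketch in Python that demonstrates this (exact fractions again; the pivot search handles the row-swapping special case mentioned above): whichever four 3D vectors you start from, the last row comes out as all zeros.

```python
from fractions import Fraction
import random

def triangulate(vectors):
    """The process from the text: for each column, pick a row with a nonzero
    entry there (swapping rows = reordering the vectors), then add multiples
    of it to the rows below it to put zeros under the pivot."""
    rows = [[Fraction(x) for x in v] for v in vectors]
    top = 0
    for col in range(len(rows[0])):
        for r in range(top, len(rows)):
            if rows[r][col] != 0:
                rows[top], rows[r] = rows[r], rows[top]
                break
        else:
            continue  # special case: the column is already all zeros
        for r in range(top + 1, len(rows)):
            factor = -rows[r][col] / rows[top][col]
            rows[r] = [a + factor * b for a, b in zip(rows[r], rows[top])]
        top += 1
    return rows

# Four random 3D vectors are always linearly dependent:
vectors = [[random.randint(-9, 9) for _ in range(3)] for _ in range(4)]
print(triangulate(vectors)[3])  # [Fraction(0, 1), Fraction(0, 1), Fraction(0, 1)]
```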

Spanning with just linear independence

Let $\red{\vec u}$, $\blue{\vec v}$ and $\green{\vec w}$ be linearly independent 3D vectors. If their span doesn't contain all 3D vectors, there's a 3D vector $\magenta{\vec m}$ that is not in $\span(\red{\vec u}, \blue{\vec v}, \green{\vec w})$. Then $\red{\vec u}$, $\blue{\vec v}$, $\green{\vec w}$ and $\magenta{\vec m}$ are linearly independent: $\magenta{\vec m}$ is not a linear combination of the other three, because that's exactly what being outside the span means, and none of the others can be a linear combination either. If, say, $\red{\vec u} = b\blue{\vec v} + c\green{\vec w} + d\magenta{\vec m}$, then either $d = 0$, which would make $\red{\vec u}$, $\blue{\vec v}$ and $\green{\vec w}$ linearly dependent, or $d \ne 0$, which would let us solve $$ \magenta{\vec m} = \frac{1}{d}\left( \red{\vec u} - b\blue{\vec v} - c\green{\vec w} \right) $$ and put $\magenta{\vec m}$ in the span after all.

By our previous result, it is not possible to have 4 linearly independent 3D vectors. This means that the span of any 3 linearly independent 3D vectors must contain all 3D vectors. The same works with any other dimension instead of 3.

The span of any $n$ linearly independent $n$-dimensional vectors is the set of all $n$-dimensional vectors.

This is basically a generalization of $\span((1,0,0),(0,1,0),(0,0,1)) = \mathbb{R}^3$ that works with any linearly independent vectors.
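
As a concrete use of this result, here is a sketch (assuming NumPy; the vectors are just an example): to write any 3D vector as a linear combination of 3 linearly independent vectors, put the vectors into the columns of a matrix and solve a linear system for the coefficients.

```python
import numpy as np

# Three linearly independent 3D vectors; their span is all of R^3.
u, v, w = [2, 4, 5], [0, -5, -9], [3, 5, 5]
A = np.array([u, v, w]).T   # columns of A are u, v, w
b = np.array([1.0, 2.0, 3.0])

c = np.linalg.solve(A, b)   # coefficients with c[0]*u + c[1]*v + c[2]*w == b
print(c)
print(A @ c)                # recovers b (up to floating-point rounding)
```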