Linear Independence


The idea of redundancy that we discussed in the introduction can now be phrased in a mathematically precise way: a list of vectors is linearly dependent if one of the vectors can be expressed as a linear combination of the others.

A list of vectors which is not linearly dependent is said to be linearly independent. In other words, a list of vectors is linearly independent if none of the vectors in the list can be written as a linear combination of the others.

Example
The list of vectors \{\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3\} where \mathbf{u}_1 = \begin{bmatrix} 1 \\ 1 \\ 2 \end{bmatrix}, \mathbf{u}_2 = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}, \mathbf{u}_3 = \begin{bmatrix} 4 \\ 7 \\ 8 \end{bmatrix} is not linearly independent, since \mathbf{u}_3 = 4\mathbf{u}_1 + 3\mathbf{u}_2.

The list of vectors \{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\} where \mathbf{v}_1 = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \mathbf{v}_2 = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}, \mathbf{v}_3 = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} is linearly independent, since any linear combination of \mathbf{v}_1 and \mathbf{v}_2 is unequal to \mathbf{v}_3, and similarly for the other two pairs of vectors.
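For readers who like to verify such claims numerically, here is a minimal sketch (not part of the original lesson) that checks both examples with NumPy, using the fact that a list of vectors is linearly independent exactly when the matrix with those vectors as columns has rank equal to the number of vectors.

```python
# Numerical check of the two examples above (a sketch, not part of the lesson).
import numpy as np

u1 = np.array([1, 1, 2])
u2 = np.array([0, 1, 0])
u3 = np.array([4, 7, 8])

# u3 is a linear combination of u1 and u2, so the first list is linearly dependent.
print(np.array_equal(4 * u1 + 3 * u2, u3))   # True

# Rank criterion: independent exactly when rank equals the number of vectors.
U = np.column_stack([u1, u2, u3])
print(np.linalg.matrix_rank(U))              # 2, so {u1, u2, u3} is dependent

V = np.eye(3)                                # columns are v1, v2, v3
print(np.linalg.matrix_rank(V))              # 3, so {v1, v2, v3} is independent
```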

Exercise
Explain geometrically why a list of three vectors in \mathbb{R}^2 is necessarily linearly dependent.

Solution. If any vector in the list is zero, then the list is linearly dependent, since the zero vector can be written as the sum of zero times each of the other vectors. So we may assume that the vectors are nonzero.

If the first two vectors point in the same direction, then the list is linearly dependent, since the second vector can be written as a constant multiple of the first vector plus zero times the third vector. If the first two vectors do not point in the same direction, then they span the whole plane. Therefore, the third vector must be in the span of the first two, so the list is linearly dependent in this case as well.
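As a quick numerical companion to this geometric argument, the following sketch (with made-up random vectors) checks that a matrix whose columns are three vectors in \mathbb{R}^2 can never have rank 3, so the three columns cannot be linearly independent.

```python
# Three vectors in R^2 are always linearly dependent: the 2x3 matrix they form
# has rank at most 2. The specific vectors here are random, made-up examples.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))      # three random vectors in R^2, as columns

# Rank is at most 2 (the number of rows), which is less than 3 vectors.
print(np.linalg.matrix_rank(A) < 3)  # True
```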

Linear dependence lemma

The definition of linear independence makes it seem as though there's quite a lot to check: if there is a vector in the list which can be written as a linear combination of some of the other ones, which one is it, and which other vectors are involved? In fact, the symmetry involved in linear relationships implies that we can put the vectors in any order we want and work through the list, checking whether each vector is in the span of the vectors earlier in the list:

Theorem (Linear dependence lemma)
A list of vectors is linearly independent if and only if there is no vector in the list which is in the span of the preceding vectors.

For example, to check that \{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\} is linearly independent, it suffices to check that \mathbf{v}_1 \neq \boldsymbol{0}, that \mathbf{v}_2 is not a scalar multiple of \mathbf{v}_1 and that \mathbf{v}_3 is not in the span of \{\mathbf{v}_1, \mathbf{v}_2\}.
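The lemma suggests a simple checking procedure: walk through the list and test whether each vector lies in the span of its predecessors. Here is a rough NumPy sketch of that idea; the helper name independent_by_lemma is ours, and the span test relies on a least-squares residual, so it is only reliable up to a floating-point tolerance.

```python
# Sketch of the check suggested by the linear dependence lemma: test each
# vector against the span of the vectors that precede it in the list.
import numpy as np

def independent_by_lemma(vectors, tol=1e-10):
    """Return True if no vector is (numerically) in the span of its predecessors."""
    for k, v in enumerate(vectors):
        if k == 0:
            # The span of the empty list is {0}, so the first vector must be nonzero.
            if np.linalg.norm(v) < tol:
                return False
            continue
        A = np.column_stack(vectors[:k])
        coeffs, *_ = np.linalg.lstsq(A, v, rcond=None)
        if np.linalg.norm(A @ coeffs - v) < tol:
            return False      # v is in the span of the preceding vectors
    return True

v1 = np.array([1.0, 1.0, 2.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([4.0, 7.0, 8.0])
print(independent_by_lemma([v1, v2, v3]))                      # False
print(independent_by_lemma([v1, v2, np.array([0., 0., 1.])]))  # True
```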

Let's walk through a proof of this theorem.

Proof. If a list is linearly independent, then no vector in the list can be represented as a linear combination of the others (by definition), so no vector can be in the span of the previous ones. This shows that linear independence implies the condition of having no vector in the span of the preceding ones.

For the other direction, suppose that the list \mathbf{v}_1, \ldots, \mathbf{v}_n is linearly dependent. Then, one of the vectors can be written as a linear combination of the others. For example, if \mathbf{v}_1 can be written as a linear combination of the others, then

\begin{align*}\mathbf{v}_1 = c_2\mathbf{v}_2 + \cdots + c_n\mathbf{v}_n\end{align*}

for some weights c_2, \ldots, c_n. If all of the weights are zero, then \mathbf{v}_1 is zero and is therefore in the span of the empty list of vectors which precede it. If at least one is nonzero, then let's define k so that c_k is the last of the nonzero c's. Then we can rearrange the equation above to find that

\begin{align*}\mathbf{v}_k = \frac{\mathbf{v}_1 - \left(c_2\mathbf{v}_2 + \cdots + c_{k- 1}\mathbf{v}_{k-1}\right)}{c_k}\end{align*}

which is in the span of \{\mathbf{v}_1, \ldots, \mathbf{v}_{k-1}\}.

So the list does not satisfy the condition of having no vector in the span of the preceding ones. Similar reasoning would apply if we had chosen any vector other than \mathbf{v}_1 as the one which can be written as a linear combination of the others. Therefore, we conclude that linear dependence does imply failure to satisfy the given condition.

From logic, we know that "A implies B" is equivalent to its contrapositive "not B implies not A". Therefore, we can say that satisfying the condition of having no vector in the span of the preceding ones does imply linear independence.
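Here is a small numerical illustration (with weights we made up) of the rearrangement step in the proof: choosing k to be the index of the last nonzero weight and solving for \mathbf{v}_k.

```python
# Sketch of the rearrangement step: if v1 = c2*v2 + ... + cn*vn and c_k is the
# last nonzero weight, then v_k = (v1 - (c2*v2 + ... + c_{k-1}*v_{k-1})) / c_k.
# The vectors and weights below are made-up example values.
import numpy as np

v2 = np.array([1.0, 0.0, 0.0])
v3 = np.array([0.0, 1.0, 0.0])
v4 = np.array([0.0, 0.0, 1.0])
c = {2: 2.0, 3: -5.0, 4: 0.0}           # weights; the last nonzero one is c_3
v1 = c[2] * v2 + c[3] * v3 + c[4] * v4

k = 3                                    # index of the last nonzero weight
v_k_recovered = (v1 - c[2] * v2) / c[k]
print(np.allclose(v_k_recovered, v3))    # True: v3 is in the span of {v1, v2}
```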

Exercise
Let's say that a linear combination of a list of vectors is trivial if all of the weights are zero.

Show that a list of vectors is linearly independent if and only if every nontrivial linear combination of the vectors is not equal to the zero vector.

Solution. Suppose that a list of vectors \{\mathbf{v}_1, \ldots, \mathbf{v}_n\} is not linearly independent. Then one of the vectors, say the first one, is equal to some linear combination of the others:

\begin{align*}\mathbf{v}_1 = c_2\mathbf{v}_2 + \cdots + c_n \mathbf{v}_n\end{align*}

Subtracting \mathbf{v}_1 from both sides of this equation, we obtain a nontrivial linear combination of the \mathbf{v}'s which is equal to the zero vector (nontrivial because the weight on \mathbf{v}_1 is -1). (If the vector known to be a linear combination of the others isn't \mathbf{v}_1, we could have done the same thing with that one instead.)

Conversely, suppose that there is a nontrivial linear combination of the \mathbf{v}'s which is equal to the zero vector:

\begin{align*}c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2 + \cdots + c_n \mathbf{v}_n = \boldsymbol{0}.\end{align*}

At least one of the weights must be nonzero, so we can solve this equation for at least one of the vectors and thereby represent it as a linear combination of the other vectors.
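To see this criterion in action numerically, the following sketch uses SciPy's null_space to look for nontrivial solutions of c_1\mathbf{v}_1 + \cdots + c_n\mathbf{v}_n = \boldsymbol{0} for the dependent list from the first example; a nontrivial null space corresponds exactly to a nontrivial combination equal to the zero vector.

```python
# The vectors are linearly independent exactly when the only solution of
# c1*v1 + ... + cn*vn = 0 is the trivial one, i.e. when the matrix with the
# v's as columns has a trivial null space. A sketch using scipy.linalg.
import numpy as np
from scipy.linalg import null_space

u1 = np.array([1.0, 1.0, 2.0])
u2 = np.array([0.0, 1.0, 0.0])
u3 = np.array([4.0, 7.0, 8.0])

A = np.column_stack([u1, u2, u3])
N = null_space(A)                 # columns span the solutions of A c = 0
print(N.shape[1])                 # 1: there is a nontrivial combination equal to 0
print(A @ N[:, 0])                # approximately the zero vector
```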
