Linear Algebra/Definition and Examples of Linear Independence

Linear Algebra
 ← Linear Independence Definition and Examples of Linear Independence Basis and Dimension → 

Spanning Sets and Linear Independence


We first characterize when a vector can be removed from a set without changing the span of that set.

Lemma 1.1

Where $S$ is a subset of a vector space $V$,

$[S] = [S \cup \{\vec{v}\}] \quad\text{if and only if}\quad \vec{v} \in [S]$

for any $\vec{v} \in V$.

Proof

The left to right implication is easy. If $[S] = [S \cup \{\vec{v}\}]$ then, since $\vec{v} \in [S \cup \{\vec{v}\}]$, the equality of the two sets gives that $\vec{v} \in [S]$.

For the right to left implication assume that $\vec{v} \in [S]$ to show that $[S] = [S \cup \{\vec{v}\}]$ by mutual inclusion. The inclusion $[S] \subseteq [S \cup \{\vec{v}\}]$ is obvious. For the other inclusion $[S \cup \{\vec{v}\}] \subseteq [S]$, write an element of $[S \cup \{\vec{v}\}]$ as $c_0\vec{v} + c_1\vec{s}_1 + \cdots + c_n\vec{s}_n$ and substitute $\vec{v}$'s expansion as a linear combination of members of the same set $S$. This is a linear combination of linear combinations and so distributing $c_0$ results in a linear combination of vectors from $S$. Hence each member of $[S \cup \{\vec{v}\}]$ is also a member of $[S]$.
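Lemma 1.1 can be checked numerically for vectors in $\mathbb{R}^n$: two finite sets have the same span exactly when each set's rank equals the rank of the two sets stacked together. Here is a small sketch using NumPy; the particular vectors are made up for illustration and are not the ones from the examples in this section.

```python
import numpy as np

def same_span(A, B):
    """Check span equality of the rows of A and B via matrix rank:
    the spans agree exactly when rank(A) == rank(B) == rank(A stacked on B)."""
    stacked = np.vstack([A, B])
    r = np.linalg.matrix_rank(stacked)
    return np.linalg.matrix_rank(A) == r and np.linalg.matrix_rank(B) == r

# S spans a plane in R^3.  The vector v lies in the span of S, so
# adjoining it (equivalently, removing it from the enlarged set)
# leaves the span unchanged, while w adds new information.
S = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
v = np.array([[1.0, 2.0, 0.0]])   # in the span of S
w = np.array([[0.0, 0.0, 1.0]])   # not in the span of S

assert same_span(S, np.vstack([S, v]))       # span unchanged: v is a "repeat"
assert not same_span(S, np.vstack([S, w]))   # span grows: w is not a repeat
```

The rank test works because the span of a set of rows is unchanged exactly when stacking on more rows does not increase the rank.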

Example 1.2

In  , where

 

the spans   and   are equal since   is in the span  .

The lemma says that if we have a spanning set then we can remove a $\vec{v}$ to get a new set $S$ with the same span if and only if $\vec{v}$ is a linear combination of vectors from $S$. Thus, under the second sense described above, a spanning set is minimal if and only if it contains no vectors that are linear combinations of the others in that set. We have a term for this important property.

Definition 1.3

A subset of a vector space is linearly independent if none of its elements is a linear combination of the others. Otherwise it is linearly dependent.

Here is an important observation:

$\vec{s}_1 = c_2\vec{s}_2 + c_3\vec{s}_3 + \cdots + c_n\vec{s}_n$

although this way of writing one vector as a combination of the others visually sets $\vec{s}_1$ off from the other vectors, algebraically there is nothing special in that equation about $\vec{s}_1$. For any $\vec{s}_i$ with a coefficient $c_i$ that is nonzero, we can rewrite the relationship to set off $\vec{s}_i$.

$\vec{s}_i = (1/c_i)\vec{s}_1 - (c_2/c_i)\vec{s}_2 - \cdots - (c_n/c_i)\vec{s}_n \qquad \text{(the $\vec{s}_i$ term is omitted from the right side)}$

When we don't want to single out any vector by writing it alone on one side of the equation we will instead say that $\vec{s}_1, \ldots, \vec{s}_n$ are in a linear relationship and write the relationship with all of the vectors on the same side. The next result rephrases the linear independence definition in this style. It gives what is usually the easiest way to compute whether a finite set is dependent or independent.

Lemma 1.4

A subset $S$ of a vector space is linearly independent if and only if for any distinct $\vec{s}_1, \ldots, \vec{s}_n \in S$ the only linear relationship among those vectors

$c_1\vec{s}_1 + \cdots + c_n\vec{s}_n = \vec{0} \qquad c_1, \ldots, c_n \in \mathbb{R}$

is the trivial one: $c_1 = 0, \ldots, c_n = 0$.

Proof

This is a direct consequence of the observation above.

If the set $S$ is linearly independent then no vector $\vec{s}_i$ can be written as a linear combination of the other vectors from $S$, so there is no linear relationship where some of the $\vec{s}\,$'s have nonzero coefficients. If $S$ is not linearly independent then some $\vec{s}_i$ is a linear combination $\vec{s}_i = c_1\vec{s}_1 + \cdots + c_{i-1}\vec{s}_{i-1} + c_{i+1}\vec{s}_{i+1} + \cdots + c_n\vec{s}_n$ of other vectors from $S$, and subtracting $\vec{s}_i$ from both sides of that equation gives a linear relationship involving a nonzero coefficient, namely the $-1$ in front of $\vec{s}_i$.

Example 1.5

In the vector space of two-wide row vectors, the two-element set   is linearly independent. To check this, set

 

and solving the resulting system

 

shows that both   and   are zero. So the only linear relationship between the two given row vectors is the trivial relationship.

In the same vector space,   is linearly dependent since we can satisfy

 

with   and  .
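The computation in Example 1.5 mechanizes directly: Lemma 1.4 says a finite list of vectors is independent exactly when the homogeneous system $c_1\vec{s}_1 + \cdots + c_n\vec{s}_n = \vec{0}$ has only the trivial solution, which holds exactly when the matrix with those vectors as columns has full column rank. A sketch in NumPy, using illustrative vectors rather than the ones from the example:

```python
import numpy as np

def is_independent(vectors):
    """Lemma 1.4 as a computation: the vectors are linearly independent
    exactly when c1*v1 + ... + cn*vn = 0 forces every ci = 0, i.e. when
    the matrix with the vectors as columns has full column rank."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

# An independent pair and a dependent pair of two-wide vectors
# (illustrative values, not the ones from the example above).
assert is_independent([np.array([1.0, 2.0]), np.array([3.0, 1.0])])
assert not is_independent([np.array([1.0, 2.0]),
                           np.array([2.0, 4.0])])   # second = 2 * first
```

In exact arithmetic this rank test is equivalent to Gauss's Method on the system; in floating point it is a numerical approximation of it.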

Remark 1.6

Recall the Statics example that began this book. We first set the unknown-mass objects at   cm and   cm and got a balance, and then we set the objects at   cm and   cm and got a balance. With those two pieces of information we could compute values of the unknown masses. Had we instead first set the unknown-mass objects at   cm and   cm, and then at   cm and   cm, we would not have been able to compute the values of the unknown masses (try it). Intuitively, the problem is that the   information is a "repeat" of the   information— that is,   is in the span of the set  — and so we would be trying to solve a two-unknowns problem with what is essentially one piece of information.

Example 1.7

The set   is linearly independent in  , the space of quadratic polynomials with real coefficients, because

 

gives

 

since polynomials are equal only if their coefficients are equal. Thus, the only linear relationship between these two members of   is the trivial one.
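Because two quadratic polynomials are equal exactly when their coefficients agree, independence of polynomials reduces to independence of their coefficient vectors in $\mathbb{R}^3$, and the rank test applies again. A sketch, with an illustrative pair of polynomials chosen for this demonstration:

```python
import numpy as np

# Represent a quadratic polynomial a + b*x + c*x**2 by its coefficient
# vector (a, b, c).  Two polynomials are equal exactly when their
# coefficient vectors are equal, so independence of the polynomials is
# independence of the vectors.  Illustrative pair: 1 + x and 1 - x.
p1 = np.array([1.0, 1.0, 0.0])    # 1 + x
p2 = np.array([1.0, -1.0, 0.0])   # 1 - x

A = np.column_stack([p1, p2])
independent = np.linalg.matrix_rank(A) == 2
assert independent
```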

Example 1.8

In  , where

 

the set   is linearly dependent because this is a relationship

 

where not all of the scalars are zero (the fact that some of the scalars are zero doesn't matter).

Remark 1.9

That example illustrates why, although Definition 1.3 is a clearer statement of what independence is, Lemma 1.4 is more useful for computations. Working straight from the definition, someone trying to compute whether   is linearly independent would start by setting   and concluding that there are no such   and  . But knowing that the first vector is not dependent on the other two is not enough. This person would have to go on to try   to find the dependence  ,  . Lemma 1.4 gets the same conclusion with only one computation.

Example 1.10

The empty subset of a vector space is linearly independent. There is no nontrivial linear relationship among its members as it has no members.

Example 1.11

In any vector space, any subset containing the zero vector is linearly dependent. For example, in the space   of quadratic polynomials, consider the subset  .

One way to see that this subset is linearly dependent is to use Lemma 1.4: we have  , and this is a nontrivial relationship as not all of the coefficients are zero. Another way to see that this subset is linearly dependent is to go straight to Definition 1.3: we can express the third member of the subset as a linear combination of the first two, namely,   is satisfied by taking   and   (in contrast to the lemma, the definition allows all of the coefficients to be zero).

(There is still another way to see that this subset is dependent that is subtler. The zero vector is equal to the trivial sum, that is, it is the sum of no vectors. So in a set containing the zero vector, there is an element that can be written as a combination of a collection of other vectors from the set, specifically, the zero vector can be written as a combination of the empty collection.)

The above examples, especially Example 1.5, underline the discussion that begins this section. The next result says that given a finite set, we can produce a linearly independent subset by discarding what Remark 1.6 calls "repeats".


Theorem 1.12

In a vector space, any finite subset has a linearly independent subset with the same span.

Proof

If the set $S$ is linearly independent then $S$ itself satisfies the statement, so assume that it is linearly dependent.

By the definition of dependence, there is a vector that is a linear combination of the others. Call that vector $\vec{s}_1$. Discard it by defining the set $S_1 = S - \{\vec{s}_1\}$. By Lemma 1.1, the span does not shrink: $[S_1] = [S]$.

Now, if $S_1$ is linearly independent then we are finished. Otherwise iterate the prior paragraph: take a vector $\vec{s}_2$ that is a linear combination of other members of $S_1$ and discard it to derive $S_2 = S_1 - \{\vec{s}_2\}$ such that $[S_2] = [S_1]$. Repeat this until a linearly independent set $S_j$ appears; one must appear eventually because $S$ is finite and the empty set is linearly independent. (Formally, this argument uses induction on $n$, the number of elements in the starting set. Problem 20 asks for the details.)
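Theorem 1.12's procedure can be sketched in code. The version below runs in the building-up direction that Problem 20's second part describes: scan the list and keep a vector only if it is not already a combination of the vectors kept so far. The helper, the function names, and the sample vectors are all illustrative.

```python
import numpy as np

def in_span(vectors, v):
    """True if v is a linear combination of the given vectors (rank test)."""
    if not vectors:
        return not np.any(v)   # only the zero vector is in the empty span
    r = np.linalg.matrix_rank(np.column_stack(vectors))
    return np.linalg.matrix_rank(np.column_stack(vectors + [v])) == r

def independent_subset(vectors):
    """Produce a linearly independent subset with the same span:
    keep a vector only if it is not a repeat of the ones already kept."""
    kept = []
    for v in vectors:
        if not in_span(kept, v):
            kept.append(v)
    return kept

# Illustrative finite subset of R^3 containing two "repeats".
S = [np.array([1.0, 0.0, 0.0]),
     np.array([0.0, 1.0, 0.0]),
     np.array([1.0, 1.0, 0.0]),   # sum of the first two: discarded
     np.array([2.0, 0.0, 0.0])]   # multiple of the first: discarded

assert len(independent_subset(S)) == 2
```

The kept set is independent because each addition passes the test of Lemma 1.16, and it has the same span because every discarded vector was, by Lemma 1.1, a repeat.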

Example 1.13

This set spans  .

 

Looking for a linear relationship

 

gives a three equations/five unknowns linear system whose solution set can be parametrized in this way.


 

So   is linearly dependent. Setting   and   shows that the fifth vector is a linear combination of the first two. Thus, Lemma 1.1 says that discarding the fifth vector

 

leaves the span unchanged  . Now, the third vector of   is a linear combination of the first two and we get

 

with the same span as  , and therefore the same span as  , but with one difference. The set   is linearly independent (this is easily checked), and so discarding any of its elements will shrink the span.

Linear Independence and Subset Relations


Theorem 1.12 describes producing a linearly independent set by shrinking, that is, by taking subsets. We finish this subsection by considering how linear independence and dependence, which are properties of sets, interact with the subset relation between sets.

Lemma 1.14

Any subset of a linearly independent set is also linearly independent. Any superset of a linearly dependent set is also linearly dependent.

Proof

This is clear.

Restated, independence is preserved by subset and dependence is preserved by superset.

Those are two of the four possible cases of interaction that we can consider. The third case, whether linear dependence is preserved by the subset operation, is covered by Example 1.13, which gives a linearly dependent set   with a subset   that is linearly dependent and another subset   that is linearly independent.

That leaves one case, whether linear independence is preserved by superset. The next example shows what can happen.

Example 1.15

In each of these three paragraphs the subset   is linearly independent.

For the set

 

the span   is the   axis. Here are two supersets of  , one linearly dependent and the other linearly independent.

dependent:        independent:  

Checking the dependence or independence of these sets is easy.

For

 

the span   is the   plane. These are two supersets.

dependent:        independent:  

If

 

then  . A linearly dependent superset is

dependent:  

but there are no linearly independent supersets of  . The reason is that for any vector that we would add to make a superset, the linear dependence equation


 

has a solution  ,  , and  .

So, in general, a linearly independent set may have a superset that is dependent. And, in general, a linearly independent set may have a superset that is independent. We can characterize when the superset is one and when it is the other.

Lemma 1.16

Where $S$ is a linearly independent subset of a vector space $V$,

$S \cup \{\vec{v}\}$ is linearly dependent if and only if $\vec{v} \in [S]$

for any $\vec{v} \in V$ with $\vec{v} \notin S$.

Proof

One implication is clear: if $\vec{v} \in [S]$ then $\vec{v} = c_1\vec{s}_1 + c_2\vec{s}_2 + \cdots + c_n\vec{s}_n$ where each $\vec{s}_i \in S$ and $c_i \in \mathbb{R}$, and so $\vec{0} = c_1\vec{s}_1 + c_2\vec{s}_2 + \cdots + c_n\vec{s}_n + (-1)\vec{v}$ is a nontrivial linear relationship among elements of $S \cup \{\vec{v}\}$.

The other implication requires the assumption that $S$ is linearly independent. With $S \cup \{\vec{v}\}$ linearly dependent, there is a nontrivial linear relationship $c_0\vec{v} + c_1\vec{s}_1 + \cdots + c_n\vec{s}_n = \vec{0}$ and independence of $S$ then implies that $c_0 \neq 0$, or else that would be a nontrivial relationship among members of $S$. Now rewriting this equation as $\vec{v} = -(c_1/c_0)\vec{s}_1 - \cdots - (c_n/c_0)\vec{s}_n$ shows that $\vec{v} \in [S]$.

(Compare this result with Lemma 1.1. Both say, roughly, that $\vec{v}$ is a "repeat" if it is in the span of $S$. However, note the additional hypothesis here of linear independence.)
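Lemma 1.16 also has a convenient rank formulation for vectors in $\mathbb{R}^n$: for an independent set $S$, adjoining $\vec{v}$ keeps independence exactly when the rank goes up by one, and produces a dependent set exactly when the rank stays put (that is, when $\vec{v} \in [S]$). A sketch with illustrative vectors:

```python
import numpy as np

def rank(vs):
    """Rank of the matrix whose columns are the given vectors."""
    return np.linalg.matrix_rank(np.column_stack(vs)) if vs else 0

# An independent set S in R^3 (illustrative choice).
S = [np.array([1.0, 0.0, 0.0]),
     np.array([0.0, 1.0, 0.0])]

v_in = np.array([3.0, -2.0, 0.0])   # in the span of S
v_out = np.array([0.0, 0.0, 1.0])   # not in the span of S

# Per Lemma 1.16: S ∪ {v} is dependent  <=>  v is in the span of S
#                 <=>  appending v does not raise the rank.
assert rank(S + [v_in]) == rank(S)        # dependent superset
assert rank(S + [v_out]) == rank(S) + 1   # independent superset
```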

Corollary 1.17

A subset $S = \{\vec{s}_1, \ldots, \vec{s}_n\}$ of a vector space is linearly dependent if and only if some $\vec{s}_i$ is a linear combination of the vectors $\vec{s}_1$, ..., $\vec{s}_{i-1}$ listed before it.

Proof

Consider $S_0 = \{\}$, $S_1 = \{\vec{s}_1\}$, $S_2 = \{\vec{s}_1, \vec{s}_2\}$, etc. Some index $i \geq 1$ is the first one with $S_{i-1} \cup \{\vec{s}_i\}$ linearly dependent, and there $\vec{s}_i \in [S_{i-1}]$.
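Corollary 1.17's witness can be found by the same scan as in its proof: walk the list and report the first vector whose arrival fails to raise the rank of the prefix. A sketch, with an illustrative list of vectors:

```python
import numpy as np

def first_repeat(vectors):
    """First index i (0-based) whose vector is a linear combination of
    the vectors listed before it, or None if the list is independent."""
    for i in range(len(vectors)):
        prefix = vectors[:i]
        r_prefix = np.linalg.matrix_rank(np.column_stack(prefix)) if prefix else 0
        r_with = np.linalg.matrix_rank(np.column_stack(prefix + [vectors[i]]))
        if r_with == r_prefix:   # rank did not grow: vectors[i] is a repeat
            return i
    return None

# Illustrative list in R^3: the third vector is the sum of the first two.
vs = [np.array([1.0, 0.0, 0.0]),
      np.array([0.0, 1.0, 0.0]),
      np.array([1.0, 1.0, 0.0]),
      np.array([0.0, 0.0, 1.0])]

assert first_repeat(vs) == 2
assert first_repeat(vs[:2]) is None
```

Note the edge case: a zero vector is reported as a repeat of the (empty) collection before it, matching the subtler observation after Example 1.11.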

Lemma 1.16 can be restated in terms of independence instead of dependence: if $S$ is linearly independent and $\vec{v} \notin S$ then the set $S \cup \{\vec{v}\}$ is also linearly independent if and only if $\vec{v} \notin [S]$. Applying Lemma 1.1, we conclude that if $S$ is linearly independent and $\vec{v} \notin S$ then $S \cup \{\vec{v}\}$ is also linearly independent if and only if $[S \cup \{\vec{v}\}] \neq [S]$. Briefly, when passing from $S$ to a superset $S_1$, to preserve linear independence we must expand the span: $[S_1] \supsetneq [S]$.

Example 1.15 shows that some linearly independent sets are maximal— have as many elements as possible— in that they have no supersets that are linearly independent. By the prior paragraph, a linearly independent set is maximal if and only if it spans the entire space, because then no vector exists that is not already in the span.

This table summarizes the interaction between the properties of independence and dependence and the relations of subset and superset.

                     subset of S             superset of S
  S independent      must be independent     may be either
  S dependent        may be either           must be dependent


In developing this table we've uncovered an intimate relationship between linear independence and span. Complementing the fact that a spanning set is minimal if and only if it is linearly independent, a linearly independent set is maximal if and only if it spans the space.

In summary, we have introduced the definition of linear independence to formalize the idea of the minimality of a spanning set. We have developed some properties of this idea. The most important is Lemma 1.16, which tells us that a linearly independent set is maximal exactly when it spans the space.

Exercises

This exercise is recommended for all readers.
Problem 1

Decide whether each subset of   is linearly dependent or linearly independent.

  1.  
  2.  
  3.  
  4.  
This exercise is recommended for all readers.
Problem 2

Which of these subsets of   are linearly dependent and which are independent?

  1.  
  2.  
  3.  
  4.  
This exercise is recommended for all readers.
Problem 3

Prove that each set   is linearly independent in the vector space of all functions from   to  .

  1.   and  
  2.   and  
  3.   and  
This exercise is recommended for all readers.
Problem 4

Which of these subsets of the space of real-valued functions of one real variable is linearly dependent and which is linearly independent? (Note that we have abbreviated some constant functions; e.g., in the first item, the " " stands for the constant function  .)

  1.  
  2.  
  3.  
  4.  
  5.  
  6.  
Problem 5

Does the equation   show that this set of functions   is a linearly dependent subset of the set of all real-valued functions with domain the interval   of real numbers between   and  ?

Problem 6

Why does Lemma 1.4 say "distinct"?

This exercise is recommended for all readers.
Problem 7

Show that the nonzero rows of an echelon form matrix form a linearly independent set.

This exercise is recommended for all readers.
Problem 8
  1. Show that if the set   is linearly independent then so is the set  .
  2. What is the relationship between the linear independence or dependence of the set   and the independence or dependence of  ?
Problem 9

Example 1.10 shows that the empty set is linearly independent.

  1. When is a one-element set linearly independent?
  2. How about a set with two elements?
Problem 10

In any vector space  , the empty set is linearly independent. What about all of  ?

Problem 11

Show that if   is linearly independent then so are all of its proper subsets:  ,  ,  ,  ,  ,  , and  . Is that "only if" also?

Problem 12
  1. Show that this
     
    is a linearly independent subset of  .
  2. Show that
     
    is in the span of   by finding   and   giving a linear relationship.
     
    Show that the pair   is unique.
  3. Assume that   is a subset of a vector space and that   is in  , so that   is a linear combination of vectors from  . Prove that if   is linearly independent then a linear combination of vectors from   adding to   is unique (that is, unique up to reordering and adding or taking away terms of the form  ). Thus   as a spanning set is minimal in this strong sense: each vector in   is "hit" a minimum number of times— only once.
  4. Prove that it can happen when   is not linearly independent that distinct linear combinations sum to the same vector.
Problem 13

Prove that a polynomial gives rise to the zero function if and only if it is the zero polynomial. (Comment. This question is not a Linear Algebra matter, but we often use the result. A polynomial gives rise to a function in the obvious way:  .)

Problem 14

Return to Section 1.2 and redefine point, line, plane, and other linear surfaces to avoid degenerate cases.

Problem 15
  1. Show that any set of four vectors in   is linearly dependent.
  2. Is this true for any set of five? Any set of three?
  3. What is the largest number of elements that a linearly independent subset of   can have?
This exercise is recommended for all readers.
Problem 16

Is there a set of four vectors in  , any three of which form a linearly independent set?

Problem 17

Must every linearly dependent set have a subset that is dependent and a subset that is independent?

Problem 18

In  , what is the biggest linearly independent set you can find? The smallest? The biggest linearly dependent set? The smallest? ("Biggest" and "smallest" mean that there are no supersets or subsets with the same property.)

This exercise is recommended for all readers.
Problem 19

Linear independence and linear dependence are properties of sets. We can thus naturally ask how those properties act with respect to the familiar elementary set relations and operations. In the body of this subsection we have covered the subset and superset relations. We can also consider the operations of intersection, complementation, and union.

  1. How does linear independence relate to intersection: can an intersection of linearly independent sets be independent? Must it be?
  2. How does linear independence relate to complementation?
  3. Show that the union of two linearly independent sets need not be linearly independent.
  4. Characterize when the union of two linearly independent sets is linearly independent, in terms of the intersection of the span of each.
This exercise is recommended for all readers.
Problem 20

For Theorem 1.12,

  1. fill in the induction for the proof;
  2. give an alternate proof that starts with the empty set and builds a sequence of linearly independent subsets of the given finite set until one appears with the same span as the given set.
Problem 21

With a little calculation we can get formulas to determine whether or not a set of vectors is linearly independent.

  1. Show that this subset of  
     
    is linearly independent if and only if  .
  2. Show that this subset of  
     
    is linearly independent if and only if  .
  3. When is this subset of  
     
    linearly independent?
  4. This is an opinion question: for a set of four vectors from  , must there be a formula involving the sixteen entries that determines independence of the set? (You needn't produce such a formula, just decide if one exists.)
This exercise is recommended for all readers.
Problem 22
  1. Prove that a set of two perpendicular nonzero vectors from   is linearly independent when  .
  2. What if  ?  ?
  3. Generalize to more than two vectors.
Problem 23

Consider the set of functions from the open interval   to  .

  1. Show that this set is a vector space under the usual operations.
  2. Recall the formula for the sum of an infinite geometric series:   for all  . Why does this not express a dependence inside of the set   (in the vector space that we are considering)? (Hint. Review the definition of linear combination.)
  3. Show that the set in the prior item is linearly independent.

This shows that some vector spaces exist with linearly independent subsets that are infinite.

Problem 24

Show that, where   is a subspace of  , if a subset   of   is linearly independent in   then   is also linearly independent in  . Is that "only if"?

Solutions
