Decide whether each subset of is linearly dependent or linearly independent.
Answer
For each of these, when the subset is independent it must be proved, and when the subset is dependent an example of a dependence must be given.
It is dependent.
Considering
gives rise to this linear system.
Gauss' method
yields a free variable, so there are infinitely many solutions. For an example of a particular dependence we can set to be, say, . Then we get and .
It is dependent. The linear system that arises here
has infinitely many solutions. We can get a particular solution by taking to be, say, , and back-substituting to get the resulting and .
It is linearly independent. The system
has only the solution and . (We could also have gotten the answer by inspection— the second vector is obviously not a multiple of the first, and vice versa.)
It is linearly dependent. The linear system
has more unknowns than equations, and so Gauss' method must end with at least one variable free (there can't be a contradictory equation because the system is homogeneous, and so has at least the solution of all zeroes). To exhibit a combination, we can do the reduction
and take, say, . Then we have that , , and .
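The computations above can be double-checked mechanically. A minimal sketch (assuming the sympy library, and using made-up stand-in vectors rather than the exercise's own): the homogeneous system $c_1\vec{v}_1+c_2\vec{v}_2+c_3\vec{v}_3=\vec{0}$ is row-reduced and a particular dependence is read off from the null space.

```python
# Sketch: detect dependence and exhibit a particular dependence with sympy.
# The vectors v1, v2, v3 are hypothetical stand-ins, not the exercise's data.
from sympy import Matrix

v1, v2, v3 = Matrix([1, 2, 3]), Matrix([2, 4, 6]), Matrix([0, 1, 1])
A = Matrix.hstack(v1, v2, v3)       # columns are the candidate vectors

null = A.nullspace()                # solutions of A*c = 0
if null:
    c = null[0]                     # a nontrivial (c1, c2, c3)
    print("dependent; for example", c.T)   # here (-2, 1, 0): -2*v1 + v2 = 0
else:
    print("independent: only the trivial solution")
```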
This exercise is recommended for all readers.
Problem 2
Which of these subsets of are linearly dependent and which are independent?
Answer
In the cases of independence, that must be proved. Otherwise, a specific dependence must be produced. (Of course, dependences other than the ones exhibited here are possible.)
This set is independent. Setting up the relation gives a linear system
with only one solution: , , and .
This set is independent. We can see this by inspection, straight from the definition of linear independence. Obviously neither is a multiple of the other.
This set is linearly independent. The linear system reduces in this way
to show that there is only the solution , , and .
This set is linearly dependent. The linear system
must, after reduction, end with at least one variable free (there are more variables than equations, and there is no possibility of a contradictory equation because the system is homogeneous). We can take the free variables as parameters to describe the solution set. We can then set the parameter to a nonzero value to get a nontrivial linear relation.
This exercise is recommended for all readers.
Problem 3
Prove that each set is linearly independent in the vector space of all functions from to .
and
and
and
Answer
Let $Z$ be the zero function, $Z(x)=0$ for all $x$; it is the additive identity in the vector space under discussion.
This set is linearly independent. Consider . Plugging in and gives a linear system
with the unique solution , .
This set is linearly independent. Consider and plug in and to get
which obviously gives that , .
This set is also linearly independent. Considering and plugging in and
gives that and .
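The plug-in-points technique used in this answer mechanizes well. A sketch, with a hypothetical pair of functions (sine and cosine, standing in for the exercise's elided pairs): substituting two sample values of $x$ yields a linear system, and a trivial-only solution establishes independence, since any dependence would have to hold at every sample point.

```python
# Sketch of the plug-in-points technique for functions, using sympy.
from sympy import symbols, sin, cos, solve

x, c1, c2 = symbols('x c1 c2')
f1, f2 = sin(x), cos(x)            # hypothetical pair of functions

relation = c1*f1 + c2*f2           # the candidate relation c1*f1 + c2*f2 = Z
equations = [relation.subs(x, pt) for pt in (0, 1)]   # plug in x = 0 and x = 1
print(solve(equations, [c1, c2]))  # {c1: 0, c2: 0}: only the trivial solution
```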
This exercise is recommended for all readers.
Problem 4
Which of these subsets of the space of real-valued functions of one real variable is linearly dependent and which is linearly independent? (Note that we have abbreviated some constant functions; e.g., in the first item, the "" stands for the constant function .)
Answer
In each case, that the set is independent must be proved, and that it is dependent must be shown by exhibiting a specific dependence.
This set is dependent. The familiar relation shows that is satisfied by and .
This set is independent. Consider the relationship (that "" is the zero function).
Taking , and gives this system.
whose only solution is , , and .
By inspection, this set is independent. No dependence is possible, since the cosine function is not a multiple of the identity function (we are applying Corollary 1.17).
By inspection, we spot that there is a dependence. Because , we get that is satisfied by and .
This set is dependent. The easiest way to see that is to recall the trigonometric relationship . (Remark. A person who doesn't recall this, and tries some 's, simply never gets a system leading to a unique solution, and never gets to conclude that the set is independent. Of course, this person might wonder if they simply never tried the right set of 's, but a few tries will lead most people to look instead for a dependence.)
This set is dependent, because it contains the zero object in the vector space, the zero polynomial.
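The remark above about hunting for a dependence suggests a numeric spot-check: a genuine relation must vanish at every sample point. As an illustration (using the Pythagorean identity as a stand-in for the relations elided above):

```python
# Sketch: numerically test a candidate dependence among functions.
# A true relation, here 1*sin(x)^2 + 1*cos(x)^2 - 1 = 0, holds at every sample.
import math

def relation(x):
    return math.sin(x)**2 + math.cos(x)**2 - 1.0

samples = [0.0, 0.5, 1.0, 2.0, 3.0]
print(all(abs(relation(x)) < 1e-12 for x in samples))   # True
```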
Problem 5
Does the equation show that this set of functions is a linearly dependent subset of the set of all real-valued functions with domain the interval of real numbers between and ?
Answer
No, that equation is not a linear relationship. In fact this set is independent, as the system arising from taking to be , and shows.
The point is that the equation is not a linear relationship among the functions, so it does not show the set to be dependent.
This exercise is recommended for all readers.
Problem 7
Show that the nonzero rows of an echelon form matrix form a linearly independent set.
Answer
We have already shown this: the Linear Combination Lemma and its corollary state that in an echelon form matrix, no nonzero row is a linear combination of the others.
This exercise is recommended for all readers.
Problem 8
Show that if the set is linearly independent then so is the set.
What is the relationship between the linear independence or dependence of the set and the independence or dependence of
?
Answer
Assume that the set $\{\vec{v}_1,\vec{v}_2,\vec{v}_3\}$ is linearly independent, so that any relationship $c_1\vec{v}_1+c_2\vec{v}_2+c_3\vec{v}_3=\vec{0}$ leads to the conclusion that $c_1=0$, $c_2=0$, and $c_3=0$.
Consider the relationship . Rewrite it to get . Taking to be , taking to be , and taking to be we have this system.
Conclusion: the 's are all zero, and so the set is linearly independent.
The second set is dependent
whether or not the first set is independent.
Problem 9
Example 1.10 shows that the empty set is linearly independent.
When is a one-element set linearly independent?
How about a set with two elements?
Answer
A singleton set $\{\vec{v}\}$ is linearly independent if and only if $\vec{v}\neq\vec{0}$. For the "if" direction, with $\vec{v}\neq\vec{0}$, we can apply Lemma 1.4 by considering the relationship $c\cdot\vec{v}=\vec{0}$ and noting that the only solution is the trivial one: $c=0$. For the "only if" direction, just recall that Example 1.11 shows that $\{\vec{0}\}$ is linearly dependent, and so if the set $\{\vec{v}\}$ is linearly independent then $\vec{v}\neq\vec{0}$.
(Remark. Another answer is to say that this is the special case of Lemma 1.16 where .)
A set with two elements is linearly independent if and only if neither member is a multiple of the other (note that if one is the zero vector then it is a multiple of the other, so this case is covered). This is an equivalent statement: a set is linearly dependent if and only if one element is a multiple of the other.
The proof is easy. A set $\{\vec{s}_1,\vec{s}_2\}$ is linearly dependent if and only if there is a relationship $c_1\vec{s}_1+c_2\vec{s}_2=\vec{0}$ with either $c_1\neq 0$ or $c_2\neq 0$ (or both). That holds if and only if $\vec{s}_1=(-c_2/c_1)\vec{s}_2$ or $\vec{s}_2=(-c_1/c_2)\vec{s}_1$ (or both).
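This two-element criterion is easy to check mechanically, since a pair is dependent exactly when the matrix having the two vectors as columns has rank less than two. A minimal sketch with illustrative vectors:

```python
# Sketch: a two-vector set is dependent iff one vector is a multiple of the
# other, i.e. iff the two columns give a matrix of rank < 2.
from sympy import Matrix

def pair_is_dependent(u, v):
    return Matrix.hstack(u, v).rank() < 2

print(pair_is_dependent(Matrix([1, 2]), Matrix([2, 4])))   # True:  (2,4) = 2*(1,2)
print(pair_is_dependent(Matrix([1, 2]), Matrix([0, 1])))   # False: independent
```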
Problem 10
In any vector space $V$, the empty set is linearly independent. What about all of $V$?
Answer
It is linearly dependent because it contains the zero vector.
Problem 11
Show that if $\{\vec{v}_1,\vec{v}_2,\vec{v}_3\}$ is linearly independent then so are all of its proper subsets: $\{\vec{v}_1,\vec{v}_2\}$, $\{\vec{v}_1,\vec{v}_3\}$, $\{\vec{v}_2,\vec{v}_3\}$, $\{\vec{v}_1\}$, $\{\vec{v}_2\}$, $\{\vec{v}_3\}$, and $\varnothing$. Is that "only if" also?
Answer
The "if" half is given by Lemma 1.14. The converse (the "only if" statement) does not hold. An example is to consider the vector space and these vectors.
Problem 12
Show that this
is a linearly independent subset of .
Show that
is in the span of by finding and giving a linear relationship.
Show that the pair is unique.
Assume that $S$ is a subset of a vector space and that $\vec{v}$ is in $[S]$, so that $\vec{v}$ is a linear combination of vectors from $S$. Prove that if $S$ is linearly independent then a linear combination of vectors from $S$ adding to $\vec{v}$ is unique (that is, unique up to reordering and adding or taking away terms of the form $0\cdot\vec{s}$). Thus $S$ as a spanning set is minimal in this strong sense: each vector in $[S]$ is "hit" a minimum number of times, only once.
Prove that it can happen when is not linearly independent that distinct linear combinations sum to the same vector.
Answer
The linear system arising from
has the unique solution and .
The linear system arising from
has the unique solution and .
Suppose that $S$ is linearly independent. Suppose that we have both $\vec{v}=c_1\vec{s}_1+\dots+c_n\vec{s}_n$ and $\vec{v}=d_1\vec{t}_1+\dots+d_m\vec{t}_m$ (where the vectors are members of $S$). Now,
\[ c_1\vec{s}_1+\dots+c_n\vec{s}_n \;=\; \vec{v} \;=\; d_1\vec{t}_1+\dots+d_m\vec{t}_m \]
can be rewritten in this way.
\[ c_1\vec{s}_1+\dots+c_n\vec{s}_n - d_1\vec{t}_1-\dots-d_m\vec{t}_m = \vec{0} \]
Possibly some of the $\vec{s}$'s equal some of the $\vec{t}$'s; we can combine the associated coefficients (i.e., if $\vec{s}_i=\vec{t}_j$ then $\cdots+c_i\vec{s}_i+\cdots-d_j\vec{t}_j-\cdots$ can be rewritten as $\cdots+(c_i-d_j)\vec{s}_i+\cdots$). That equation is a linear relationship among distinct (after the combining is done) members of the set $S$. We've assumed that $S$ is linearly independent, so all of the coefficients are zero. If $i$ is such that $\vec{s}_i$ does not equal any $\vec{t}_j$ then $c_i$ is zero. If $j$ is such that $\vec{t}_j$ does not equal any $\vec{s}_i$ then $d_j$ is zero. In the final case, we have that $c_i-d_j=0$ and so $c_i=d_j$.
Therefore, the original two sums are the same, except perhaps for some $0\cdot\vec{s}_i$ or $0\cdot\vec{t}_j$ terms that we can neglect.
This set is not linearly independent:
and these two linear combinations give the same result
Thus, with a linearly dependent set, two distinct linear combinations can give the same sum.
In fact, this stronger statement holds: if a set is linearly dependent then it must have the property that there are two distinct linear combinations that sum to the same vector. Briefly, where $c_1\vec{s}_1+\dots+c_n\vec{s}_n=\vec{0}$ is a relationship, multiplying both sides by two gives another relationship, $2c_1\vec{s}_1+\dots+2c_n\vec{s}_n=\vec{0}$.
If the first relationship is nontrivial then the second is also.
Problem 13
Prove that a polynomial gives rise to the zero function if and only if it is the zero polynomial. (Comment. This question is not a Linear Algebra matter, but we often use the result. A polynomial gives rise to a function in the obvious way: .)
Answer
In this "if and only if" statement, the "if" half is clear— if the polynomial is the zero polynomial then the function that arises from the action of the polynomial must be the zero function . For "only if" we write . Plugging in zero gives that . Taking the derivative and plugging in zero gives that . Similarly we get that each is zero, and is the zero
polynomial.
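For reference, the derivative step can be written out in general; this is a routine calculus computation, not quoted from the text.

```latex
% For p(x) = a_0 + a_1 x + \dots + a_n x^n, repeated differentiation gives
\[
p^{(k)}(x) = k!\,a_k + (\text{terms each carrying a factor of } x),
\qquad\text{so}\qquad
p^{(k)}(0) = k!\,a_k .
\]
% If p gives rise to the zero function then every derivative p^{(k)} is the
% zero function too, so a_k = p^{(k)}(0)/k! = 0 for each k.
```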
Problem 14
Return to Section 1.2 and redefine point, line, plane, and other linear surfaces to avoid degenerate cases.
Answer
The work in this section suggests that a $k$-dimensional non-degenerate linear surface should be defined as the span of a linearly independent set of $k$ vectors.
Problem 15
Show that any set of four vectors in $\mathbb{R}^2$ is linearly dependent.
Is this true for any set of five? Any set of three?
What is the largest number of elements that a linearly independent subset of $\mathbb{R}^2$ can have?
Answer
For any $\vec{v}_1,\vec{v}_2,\vec{v}_3,\vec{v}_4\in\mathbb{R}^2$, the relationship
\[ c_1\vec{v}_1+c_2\vec{v}_2+c_3\vec{v}_3+c_4\vec{v}_4=\vec{0} \]
yields a linear system of two equations in the four unknowns $c_1,\dots,c_4$ that has infinitely many solutions (Gauss' method leaves at least two variables free). Hence there are nontrivial linear relationships among the given members of $\mathbb{R}^2$.
Any set of five vectors is a superset of a set of four vectors, and so is linearly dependent.
With three vectors from $\mathbb{R}^2$, the argument from the prior item still applies, with the slight change that Gauss' method now leaves at least one variable free (and that still gives infinitely many solutions).
The prior item shows that no three-element subset of $\mathbb{R}^2$ is independent. We know that there are two-element subsets of $\mathbb{R}^2$ that are independent (the two standard basis vectors give one), and so the answer is two.
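The counting argument can be made concrete: four vectors in $\mathbb{R}^2$ form a $2\times 4$ matrix of rank at most two, so the homogeneous system has free variables. A sketch with placeholder vectors:

```python
# Sketch: any four vectors in R^2 are dependent, since the 2x4 homogeneous
# system has rank at most 2 and hence at least two free variables.
from sympy import Matrix

A = Matrix([[1, 0, 3, -1],
            [0, 1, 2,  5]])        # columns: four placeholder vectors in R^2
print(A.rank())                    # 2, so 4 - 2 = 2 free variables
print(A.nullspace()[0].T)          # a nontrivial dependence among the columns
```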
This exercise is recommended for all readers.
Problem 16
Is there a set of four vectors in $\mathbb{R}^3$, any three of which form a linearly independent set?
Answer
Yes; here is one.
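One standard witness (possibly different from the set displayed in the text) is the three standard basis vectors of $\mathbb{R}^3$ together with their sum; the sketch below checks that every three-element subset has full rank.

```python
# Sketch: e1, e2, e3 and e1+e2+e3; every trio of these is independent.
from itertools import combinations
from sympy import Matrix

vs = [Matrix([1, 0, 0]), Matrix([0, 1, 0]),
      Matrix([0, 0, 1]), Matrix([1, 1, 1])]
print(all(Matrix.hstack(*trio).rank() == 3
          for trio in combinations(vs, 3)))   # True
```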
Problem 17
Must every linearly dependent set have a subset that is dependent and a subset that is independent?
Answer
Yes. The two improper subsets, the entire set and the empty subset, serve as examples.
Problem 18
In $\mathbb{R}^4$, what is the biggest linearly independent set you can find? The smallest? The biggest linearly dependent set? The smallest? ("Biggest" and "smallest" mean that there are no supersets or subsets with the same property.)
Answer
In $\mathbb{R}^4$ the biggest linearly independent set has four vectors. There are many examples of such sets; this is one.
To see that no set with five or more vectors can be independent, set up
and note that the resulting linear system
has four equations and five unknowns, so Gauss' method must end with at least one variable free, so there are infinitely many solutions, and so the above linear relationship among the four-tall vectors has more solutions than just the trivial solution.
The smallest linearly independent set is the empty set.
The biggest linearly dependent set is $\mathbb{R}^4$ itself. The smallest is $\{\vec{0}\}$.
This exercise is recommended for all readers.
Problem 19
Linear independence and linear dependence are properties of sets. We can thus naturally ask how those properties act with respect to the familiar elementary set relations and operations. In the body of this subsection we have covered the subset and superset relations. We can also consider the operations of intersection, complementation, and union.
How does linear independence relate to intersection: can an intersection of linearly independent sets be independent? Must it be?
How does linear independence relate to complementation?
Show that the union of two linearly independent sets need not be linearly independent.
Characterize when the union of two linearly independent sets is linearly independent, in terms of the intersection of the span of each.
Answer
The intersection of two linearly independent sets must be linearly independent, as it is a subset of each of the two (linearly independent) sets.
The complement of a linearly independent set is linearly dependent as it contains the zero vector.
We must produce an example of two independent sets whose union is easily seen to be linearly dependent; one appears in the sketch following this answer.
The union of two linearly independent sets $B_1$ and $B_2$ is linearly independent if and only if their spans have a trivial intersection, $[B_1]\cap[B_2]=\{\vec{0}\}$. To prove that, assume that $B_1$ and $B_2$ are linearly independent subsets of some vector space.
For the "if" direction, assume that the intersection of the spans is trivial. Consider a linear relationship $c_1\vec{\beta}_1+\dots+c_n\vec{\beta}_n+d_1\vec{\gamma}_1+\dots+d_m\vec{\gamma}_m=\vec{0}$ among members of $B_1\cup B_2$, with the $\vec{\beta}$'s from $B_1$ and the $\vec{\gamma}$'s from $B_2$, and rewrite it as $c_1\vec{\beta}_1+\dots+c_n\vec{\beta}_n=-d_1\vec{\gamma}_1-\dots-d_m\vec{\gamma}_m$. The left side of that equation sums to a vector in $[B_1]$, and the right side is a vector in $[B_2]$. Therefore, since the intersection of the spans is trivial, both sides equal the zero vector. Because $B_1$ is linearly independent, all of the $c$'s are zero. Because $B_2$ is linearly independent, all of the $d$'s are zero. Thus, the original linear relationship among members of $B_1\cup B_2$ only holds if all of the coefficients are zero. That shows that $B_1\cup B_2$ is linearly independent.
For the "only if" half, suppose that the union is linearly independent and take any $\vec{v}$ in $[B_1]\cap[B_2]$, writing it both ways: $\vec{v}=c_1\vec{\beta}_1+\dots+c_n\vec{\beta}_n=d_1\vec{\gamma}_1+\dots+d_m\vec{\gamma}_m$. Then $c_1\vec{\beta}_1+\dots+c_n\vec{\beta}_n-d_1\vec{\gamma}_1-\dots-d_m\vec{\gamma}_m=\vec{0}$ is a linear relationship among members of the union, so each scalar is zero, and therefore $\vec{v}=\vec{0}$.
Problem 20
Give an alternate proof that starts with the empty set and builds a sequence of linearly independent subsets of the given finite set until one appears with the same span as the given set.
Answer
We do induction on the number of vectors in the finite set $S$.
The base case is that $S$ has no elements. In this case $S$ is linearly independent and there is nothing to check: a subset of $S$ that has the same span as $S$ is $S$ itself.
For the inductive step assume that the theorem is true for all sets of size $0$, $1$, ..., $n-1$ in order to prove that it holds when $S$ has $n$ elements. If the $n$-element set $S$ is linearly independent then the theorem is trivial, so assume that it is dependent. By Corollary 1.17 there is a vector in $S$ that is a linear combination of other vectors in $S$. Discarding that vector leaves a set that, by Lemma 1.1, has the same span as $S$. That set has $n-1$ elements and so the inductive hypothesis applies to give that it has a linearly independent subset with the same span. That subset of $S$ is the desired subset of $S$.
Here is a sketch of the argument. The induction argument details have been left out.
If the finite set $S$ is empty then there is nothing to prove. If $[S]=\{\vec{0}\}$ then the empty subset will do.
Otherwise, take some nonzero vector $\vec{s}_1\in S$ and define $S_1=\{\vec{s}_1\}$. If $[S_1]=[S]$ then this proof is finished by noting that $S_1$ is linearly independent.
If not, then there is a nonzero vector $\vec{s}_2\in S$ that is not in $[S_1]$ (if every $\vec{s}\in S$ were in $[S_1]$ then we would have $[S_1]=[S]$). Define $S_2=S_1\cup\{\vec{s}_2\}$. If $[S_2]=[S]$ then this proof is finished by using Theorem 1.17 to show that $S_2$ is linearly independent.
Repeat the last paragraph until a set with a big enough span appears. That must eventually happen because $S$ is finite, and $[S]$ will be reached at worst when every vector from $S$ has been used.
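The procedure just sketched is a greedy algorithm, and it renders directly into code. A minimal version (assuming sympy; the sample set is illustrative): keep a vector exactly when it raises the rank, that is, when it lies outside the span of the vectors kept so far.

```python
# Sketch: build a linearly independent subset with the same span by scanning
# the finite set and keeping each vector that enlarges the span.
from sympy import Matrix

def independent_subset(vectors):
    kept = []
    for v in vectors:
        candidate = kept + [v]
        if Matrix.hstack(*candidate).rank() > len(kept):
            kept = candidate            # v is outside the span of 'kept'
    return kept

S = [Matrix([1, 0, 0]), Matrix([2, 0, 0]),
     Matrix([0, 1, 0]), Matrix([1, 1, 0])]
print([v.T for v in independent_subset(S)])   # keeps (1,0,0) and (0,1,0)
```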
Problem 21
With a little calculation we can get formulas to determine whether or not a set of vectors is linearly independent.
Show that this subset of
is linearly independent if and only if .
Show that this subset of
is linearly independent iff .
When is this subset of
linearly independent?
This is an opinion question: for a set of four vectors from , must there be a formula involving the sixteen entries that determines independence of the set? (You needn't produce such a formula, just decide if one exists.)
Answer
Assuming first that ,
gives
which has a solution if and only if (we've assumed in this case that , and so back substitution yields a unique solution).
The case is also not hard— break it into the and subcases and note that in these cases .
Comment. An earlier exercise showed that a two-vector set is linearly dependent if and only if either vector is a scalar multiple of the other. That can also be used to make the calculation.
The equation
gives rise to a homogeneous linear system. We proceed by writing it in matrix form and applying Gauss' method.
We first reduce the matrix to upper-triangular. Assume that .
(where we've assumed for the moment that in order to do the row reduction step). Then, under the assumptions, we get this.
shows that the original system is nonsingular if and only if the entry is nonzero. This fraction is defined because of the assumption, and it will equal zero if and only if its numerator equals zero.
We next worry about the assumptions. First, if but then we swap
and conclude that the system is nonsingular if and only if either or . That's the same as asking that their product be zero:
(in going from the first line to the second we've applied the case assumption that by substituting for ). Since we are assuming that , we have that . With we can rewrite this to fit the form we need: in this and case, the given system is nonsingular when , as required.
The remaining cases have the same character. Do the but case and the and but case by first swapping rows and then going on as above. The , , and case is easy— a set with a zero vector is linearly dependent, and the formula comes out to equal zero.
It is linearly dependent if and only if either vector is a multiple of the other. That is, it is not independent iff
(or both) for some scalars and . Eliminating and in order to restate this condition only in terms of the given letters , , , , , , we have that it is not independent— it is dependent— iff .
Dependence or independence is a function of the entries, so there is indeed a formula (although at first glance a person might think the formula involves cases: "if the first component of the first vector is zero then ...", this guess turns out not to be correct).
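On the last item: for $n$ vectors from $\mathbb{R}^n$, independence is equivalent to the matrix with those vectors as columns having a nonzero determinant, a single polynomial formula in the entries (for four vectors from $\mathbb{R}^4$, a polynomial in the sixteen entries). A sketch recovering the two-vector case symbolically:

```python
# Sketch: the determinant gives a single case-free formula for independence.
from sympy import Matrix, symbols

a, b, c, d = symbols('a b c d')
print(Matrix([[a, b], [c, d]]).det())   # a*d - b*c; nonzero iff independent
```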
This exercise is recommended for all readers.
Problem 22
Prove that a set of two perpendicular nonzero vectors from $\mathbb{R}^n$ is linearly independent when $n>1$.
What if $n=1$? $n=0$?
Generalize to more than two vectors.
Answer
Recall that two vectors from $\mathbb{R}^n$ are perpendicular if and only if their dot product is zero.
Assume that $\vec{v}_1$ and $\vec{v}_2$ are perpendicular nonzero vectors in $\mathbb{R}^n$, with $n>1$. With the linear relationship $c_1\vec{v}_1+c_2\vec{v}_2=\vec{0}$, apply $\vec{v}_1$ to both sides (that is, dot each side with $\vec{v}_1$) to conclude that $c_1\,(\vec{v}_1\cdot\vec{v}_1)=0$. Because $\vec{v}_1\neq\vec{0}$ we have that $c_1=0$. A similar application of $\vec{v}_2$ shows that $c_2=0$.
Two vectors in $\mathbb{R}^1$ are perpendicular if and only if at least one of them is zero.
We define $\mathbb{R}^0$ to be a trivial space, and so both $\vec{v}_1$ and $\vec{v}_2$ are the zero vector.
The right generalization is to look at a set of vectors $\{\vec{v}_1,\dots,\vec{v}_k\}$ that are mutually orthogonal (also called pairwise perpendicular): if $i\neq j$ then $\vec{v}_i$ is perpendicular to $\vec{v}_j$. Mimicking the proof of the first item above shows that such a set of nonzero vectors is linearly independent.
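The key computation in that generalization can be written out; this is the standard argument, consistent with the first item.

```latex
% Dot each side of c_1 v_1 + ... + c_k v_k = 0 with a fixed v_j:
\[
0 = \vec{v}_j \cdot \bigl( c_1\vec{v}_1 + \cdots + c_k\vec{v}_k \bigr)
  = c_j\,(\vec{v}_j \cdot \vec{v}_j)
\]
% since the cross terms vanish by mutual orthogonality. Each v_j is nonzero,
% so v_j . v_j > 0, forcing c_j = 0 for every j.
```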
Problem 23
Consider the set of functions from the open interval $(-1..1)$ to $\mathbb{R}$.
Show that this set is a vector space under the usual operations.
Recall the formula for the sum of an infinite geometric series: $1+x+x^2+\cdots=1/(1-x)$ for all $x\in(-1..1)$. Why does this not express a dependence inside of the set $\{g,f_0,f_1,f_2,\ldots\}$, where $g(x)=1/(1-x)$ and $f_n(x)=x^n$ (in the vector space that we are considering)? (Hint. Review the definition of linear combination.)
Show that the set in the prior item is linearly independent.
This shows that some vector spaces exist with linearly independent subsets that are infinite.
Answer
This check is routine.
The summation is infinite (has infinitely many summands). The definition of linear combination involves only finite sums.
No nontrivial finite sum of members of the set adds to the zero object: assume that
\[ c\cdot\frac{1}{1-x}+c_0\cdot 1+c_1\cdot x+\dots+c_n\cdot x^n = 0 \]
(any finite sum uses a highest power, here $n$). Multiply both sides by $1-x$ to conclude that each coefficient is zero, because a polynomial describes the zero function only when it is the zero polynomial.
Problem 24
Show that, where $W$ is a subspace of $V$, if a subset $S$ of $W$ is linearly independent in $W$ then $S$ is also linearly independent in $V$. Is that "only if"?
Answer
It is both "if" and "only if".
Let $S$ be a subset of the subspace $W$ of the vector space $V$. The assertion that any linear relationship $c_1\vec{s}_1+\dots+c_n\vec{s}_n=\vec{0}$ among members of $S$ must be the trivial relationship $c_1=0$, ..., $c_n=0$ is a statement that holds in $W$ if and only if it holds in $V$, because the subspace $W$ inherits its addition and scalar multiplication operations from $V$.