Linear Algebra/Basis/Solutions

Solutions

This exercise is recommended for all readers.
Problem 1

Decide if each is a basis for  .

  1.  
  2.  
  3.  
  4.  
Answer

By Theorem 1.12, each is a basis if and only if each vector in the space can be given in a unique way as a linear combination of the given vectors.

  1. Yes this is a basis. The relation
     
    gives
     
    which has the unique solution  ,  , and  .
  2. This is not a basis. Setting it up as in the prior item
     
    gives a linear system whose solution
     
    is possible if and only if the three-tall vector's components  ,  , and   satisfy  . For instance, we can find the coefficients   and   that work when  ,  , and  . However, there are no  's that work for  ,  , and  . Thus this is not a basis; it does not span the space.
  3. Yes, this is a basis. Setting up the relationship leads to this reduction
     
    which has a unique solution for each triple of components  ,  , and  .
  4. No, this is not a basis. The reduction
     
    shows that there is not a solution for every triple  ,  , and  . Instead, the span of the given set includes only those three-tall vectors where  .
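As an aside, the check used throughout this answer is mechanical enough to automate. The sketch below is a minimal sympy illustration with hypothetical column vectors (stand-ins, not the exercise's own): a three-element set is a basis for the space exactly when the matrix with those vectors as columns has a pivot in every column, so that every right-hand side determines a unique combination.

```python
import sympy as sp

# Hypothetical candidate vectors for R^3 (illustrative stand-ins only).
v1 = sp.Matrix([1, 2, 3])
v2 = sp.Matrix([0, 1, 1])
v3 = sp.Matrix([1, 0, 2])

A = sp.Matrix.hstack(v1, v2, v3)   # candidate basis vectors as columns

# Every three-tall vector has a unique representation exactly when the
# reduced echelon form has a pivot in every column (for a square matrix,
# equivalently det(A) != 0).
rref, pivots = A.rref()
print("basis for R^3?", len(pivots) == 3)
print("det =", A.det())            # nonzero gives the same conclusion
```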
This exercise is recommended for all readers.
Problem 2

Represent the vector with respect to the basis.

  1.  ,  
  2.  ,  
  3.  ,  
Answer
  1. We solve
     
    with
     
    and conclude that   and so  . Thus, the representation is this.
     
  2. The relationship   is easily solved by eye to give that  ,  ,  , and  .
     
  3.  
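Finding a representation amounts to solving one linear system, so a quick symbolic check is easy. This sketch uses a hypothetical basis and vector (not the exercise's own) to show the pattern: put the basis vectors in the columns of a matrix and solve for the coefficients.

```python
import sympy as sp

# Hypothetical basis of R^2 and vector to represent (illustrative only).
beta1 = sp.Matrix([1, 1])
beta2 = sp.Matrix([1, -1])
v = sp.Matrix([3, 1])

B = sp.Matrix.hstack(beta1, beta2)
coords = B.solve(v)        # c with c[0]*beta1 + c[1]*beta2 = v
print(coords)              # the representation of v as a column of coefficients
```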
Problem 3

Find a basis for  , the space of all quadratic polynomials. Must any such basis contain a polynomial of each degree: degree zero, degree one, and degree two?

Answer

One basis is  . There are bases for   that do not contain any polynomials of degree one or degree zero. One is  . (Every basis has at least one polynomial of degree two, though.)
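To make the claim concrete, here is one hypothetical basis of that second kind (three polynomials, all of degree two) together with a sympy check; the answer's own example may differ. Writing each polynomial as a coordinate vector reduces the check to a rank computation.

```python
import sympy as sp

x = sp.symbols('x')

# Three degree-two polynomials; a hypothetical basis with no element of
# degree zero or one.
polys = [x**2, x**2 + x, x**2 + 1]

# Coefficient vectors (highest degree first), one column per polynomial.
M = sp.Matrix.hstack(*[sp.Matrix(sp.Poly(p, x).all_coeffs()) for p in polys])

print(M)
# Rank 3 means the three polynomials are independent and, since the space of
# quadratics is three-dimensional, they also span it.
print("basis?", M.rank() == 3)
```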

Problem 4

Find a basis for the solution set of this system.

 
Answer

The reduction

 

gives that the only condition is that  . The solution set is

 

and so the obvious candidate for the basis is this.

 

We've shown that this spans the space, and showing it is also linearly independent is routine.
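The same procedure (row reduce, parametrize by the free variables, and read off one basis vector per parameter) is what sympy's nullspace computes. The sketch below applies it to a hypothetical homogeneous system, since only the method is being illustrated.

```python
import sympy as sp

# Hypothetical coefficient matrix of a homogeneous system (illustrative only).
A = sp.Matrix([[1, 1, -1, 0],
               [2, 2, -2, 0],
               [1, 0,  1, 1]])

# nullspace() row reduces and returns one vector per free variable; those
# vectors span the solution set and are linearly independent, so together
# they form a basis for it.
for b in A.nullspace():
    print(b.T)
```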

This exercise is recommended for all readers.
Problem 5

Find a basis for  , the space of   matrices.

Answer

There are many bases. This is an easy one.

 
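One easy basis for a space of matrices is the family of matrices having a single entry equal to one and every other entry zero. As a hedged sketch (the exact size in the exercise is not repeated here), this builds that family for whatever dimensions are wanted.

```python
import sympy as sp

def standard_matrix_basis(m, n):
    """One basis: for each position (i, j), the m x n matrix with a 1 there
    and 0 everywhere else; there are m*n of them."""
    basis = []
    for i in range(m):
        for j in range(n):
            E = sp.zeros(m, n)
            E[i, j] = 1
            basis.append(E)
    return basis

# Dimensions chosen only for illustration.
print(len(standard_matrix_basis(2, 2)))   # 4 basis matrices in the 2x2 case
```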
This exercise is recommended for all readers.
Problem 6

Find a basis for each.

  1. The subspace   of  
  2. The space of three-wide row vectors whose first and second components add to zero
  3. This subspace of the   matrices
     
Answer

For each item, many answers are possible.

  1. One way to proceed is to parametrize by expressing the   as a combination of the other two  . Then   is   and
     
    suggests  . This only shows that it spans, but checking that it is linearly independent is routine.
  2. Parametrize   to get  , which suggests using the sequence  . We've shown that it spans, and checking that it is linearly independent is easy.
  3. Rewriting
     
    suggests this for the basis.
     
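Each of these answers follows the same parametrize-and-read-off-a-basis pattern. As one concrete instance, item 2's subspace (three-wide row vectors whose first and second components add to zero) can be handled as in this sketch; the basis shown is one possibility and need not match the answer's own choice.

```python
import sympy as sp

b, c = sp.symbols('b c')

# Row vectors (a, b, c) with a + b = 0: setting a = -b parametrizes the
# subspace as b*(-1, 1, 0) + c*(0, 0, 1), suggesting this two-element basis.
basis = [sp.Matrix([[-1, 1, 0]]), sp.Matrix([[0, 0, 1]])]

general = b * basis[0] + c * basis[1]
print(general)                               # Matrix([[-b, b, c]])
print(sp.simplify(general[0] + general[1]))  # 0: every combination lies in the subspace
```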
Problem 7

Check Example 1.6.

Answer

We will show that the second is a basis; the first is similar. We will show this straight from the definition of a basis, because this example appears before Theorem 1.12.

To see that it is linearly independent, we set up  . Taking   and   gives this system

 

which shows that   and  .

The calculation for span is also easy; for any  , we have that   gives that   and that  , and so the span is the entire space.
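The two checks, independence and span, done straight from the definition can also be run symbolically. The basis vectors below are hypothetical stand-ins, since Example 1.6 is not restated here.

```python
import sympy as sp

# Hypothetical two-element basis of R^2 (stand-ins for Example 1.6's vectors).
b1, b2 = sp.Matrix([1, 1]), sp.Matrix([1, -1])
c1, c2, x, y = sp.symbols('c1 c2 x y')

# Linear independence: c1*b1 + c2*b2 = 0 forces c1 = c2 = 0.
print(sp.solve(list(c1*b1 + c2*b2), [c1, c2], dict=True))
# -> [{c1: 0, c2: 0}]

# Span: for any (x, y) the system c1*b1 + c2*b2 = (x, y) has a solution.
print(sp.solve(list(c1*b1 + c2*b2 - sp.Matrix([x, y])), [c1, c2], dict=True))
# -> [{c1: x/2 + y/2, c2: x/2 - y/2}]
```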

This exercise is recommended for all readers.
Problem 8

Find the span of each set and then find a basis for that span.

  1.   in  
  2.   in  
Answer
  1. Asking which   can be expressed as   gives rise to three linear equations, describing the coefficients of  ,  , and the constants.
     
    Gauss' method with back-substitution shows, provided that  , that   and  . Thus, with  , we can compute appropriate   and   for any   and  . So the span is the entire set of linear polynomials  . Parametrizing that set   suggests a basis   (we've shown that it spans; checking linear independence is easy).
  2. With
     
    we get this system.
     
    Thus, the only quadratic polynomials   with associated  's are the ones such that  . Hence the span is  . Parametrizing gives  , which suggests   (checking that it is linearly independent is routine).
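In coordinate-vector form the whole computation is a column-space problem, which the sketch below illustrates on a hypothetical set of quadratics (not the exercise's sets): the span of the polynomials corresponds to the column space of their coefficient matrix, and the pivot columns give a basis.

```python
import sympy as sp

x = sp.symbols('x')

# Hypothetical set of quadratics whose span we want (illustrative only).
polys = [x**2 + x, 2*x**2 + 2*x, x - 1]

# Coefficient matrix: row k holds the coefficients of x^k, one column per polynomial.
M = sp.Matrix(3, 3, lambda k, j: polys[j].coeff(x, k))

# columnspace() returns the pivot columns, a basis for the column space;
# converting back gives a basis of the spanned subspace of quadratics.
basis = [sum(col[k] * x**k for k in range(3)) for col in M.columnspace()]
print(basis)    # here the span turns out to be two-dimensional
```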
This exercise is recommended for all readers.
Problem 9

Find a basis for each of these subspaces of the space   of cubic polynomials.

  1. The subspace of cubic polynomials   such that  
  2. The subspace of polynomials   such that   and  
  3. The subspace of polynomials   such that  ,  , and  
  4. The space of polynomials   such that  ,  ,  , and  
Answer
  1. The subspace is  . Rewriting   gives  , which, on breaking out the parameters, suggests   for the basis (it is easily verified).
  2. The given subspace is the collection of cubics   such that   and  . Gauss' method
     
    gives that   and that  . Rewriting   as   suggests this for a basis  . The above shows that it spans the space. Checking it is linearly independent is routine. (Comment. A worthwhile check is to verify that both polynomials in the basis have both seven and five as roots.)
  3. Here there are three conditions on the cubics, that  , that  , and that  . Gauss' method
     
    yields the single free variable  , with  ,  , and  . The parametrization is this.
     
    Therefore, a good candidate for the basis is  . It spans the space by the work above. It is clearly linearly independent because it is a one-element set (with that single element not the zero object of the space). Thus, any cubic through the three points  ,  , and   is a multiple of this one. (Comment. As in the prior question, a worthwhile check is to verify that plugging seven, five, and three into this polynomial yields zero each time.)
  4. This is the trivial subspace of  . Thus, the basis is empty  .

Remark. The polynomial in the third item could alternatively have been derived by multiplying out (x-7)(x-5)(x-3).
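Following the Remark, the cubic can be produced and the "worthwhile check" carried out mechanically; this assumes, as the Comments in items 2 and 3 indicate, that the conditions are vanishing at 7, 5, and 3.

```python
import sympy as sp

x = sp.symbols('x')

# The Remark's alternative derivation: expand a cubic with roots 7, 5, and 3.
p = sp.expand((x - 7) * (x - 5) * (x - 3))
print(p)                                   # x**3 - 15*x**2 + 71*x - 105
print([p.subs(x, r) for r in (7, 5, 3)])   # [0, 0, 0], the check from the Comment
```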

Problem 10

We've seen that it is possible for a basis to remain a basis when it is reordered. Must it always remain a basis?

Answer

Yes. Linear independence and span are unchanged by reordering.

Problem 11

Can a basis contain a zero vector?

Answer

No. A basis is linearly independent, and no linearly independent set contains the zero vector.

This exercise is recommended for all readers.
Problem 12

Let   be a basis for a vector space.

  1. Show that   is a basis when  . What happens when at least one   is  ?
  2. Prove that   is a basis where  .
Answer
  1. To show that it is linearly independent, note that   gives that  , which in turn implies that each   is zero. But with   that means that each   is zero. Showing that it spans the space is much the same; because   is a basis, and so spans the space, we can for any   write  , and then  . If any of the scalars are zero then the result is not a basis, because it is not linearly independent.
  2. Showing that   is linearly independent is easy. To show that it spans the space, assume that  . Then, we can represent the same   with respect to   in this way  .
Problem 13

Find one vector   that will make each into a basis for the space.

  1.   in  
  2.   in  
  3.   in  
Answer

Each forms a linearly independent set if   is omitted. To preserve linear independence, we must expand the span of each. That is, we must determine the span of each (leaving   out), and then pick a   lying outside of that span. Then to finish, we must check that the result spans the entire given space. Those checks are routine.

  1. Any vector that is not a multiple of the given one, that is, any vector that is not on the line   will do here. One is  .
  2. By inspection, we notice that the vector   is not in the span of the set of the two given vectors. The check that the resulting set is a basis for   is routine.
  3. For any member of the span  , the coefficient of   equals the constant term. So we expand the span if we add a quadratic without this property, say,  . The check that the result is a basis for   is easy.
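The "pick a vector outside the span" step can be verified with a rank computation, as in this sketch with hypothetical vectors standing in for the given ones: the completed set is a basis of the three-dimensional space exactly when the three vectors together have rank three.

```python
import sympy as sp

# Hypothetical given vectors in R^3 (illustrative stand-ins).
u1 = sp.Matrix([1, 0, 1])
u2 = sp.Matrix([0, 1, 0])

# A candidate v completes a basis exactly when it lies outside span{u1, u2},
# i.e. when the three vectors stacked as columns have rank 3.
v = sp.Matrix([0, 0, 1])
print(sp.Matrix.hstack(u1, u2, v).rank() == 3)   # True: v extends the set to a basis
```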
This exercise is recommended for all readers.
Problem 14

Where   is a basis, show that in this equation

 

each of the  's is zero. Generalize.

Answer

To show that each scalar is zero, simply subtract  . The obvious generalization is that in any equation involving only the  's, and in which each   appears only once, each scalar is zero. For instance, an equation with a combination of the even-indexed basis vectors (i.e.,  ,  , etc.) on the right and the odd-indexed basis vectors on the left also gives the conclusion that all of the coefficients are zero.

Problem 15

A basis contains some of the vectors from a vector space; can it contain them all?

Answer

No; every vector space contains the zero vector, but no linearly independent set contains the zero vector, so no basis can contain all of a space's vectors.

Problem 16

Theorem 1.12 shows that, with respect to a basis, every linear combination is unique. If a subset is not a basis, can linear combinations be not unique? If so, must they be?

Answer

Here is a subset of   that is not a basis, and two different linear combinations of its elements that sum to the same vector.

 

Thus, when a subset is not a basis, it can be the case that its linear combinations are not unique.

But a subset that is not a basis need not have non-unique linear combinations. For instance, this set

 

does have the property that

 

implies that  . The idea here is that this subset fails to be a basis because it fails to span the space; the proof of the theorem establishes that linear combinations are unique if and only if the subset is linearly independent.
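Both halves of this answer are easy to reproduce symbolically. The vectors below are hypothetical, not the text's own example: a dependent set admits two different combinations with the same sum, while an independent but non-spanning set still has unique combinations.

```python
import sympy as sp

# A dependent subset of R^2: two different coefficient choices, same vector.
v1, v2, v3 = sp.Matrix([1, 0]), sp.Matrix([0, 1]), sp.Matrix([1, 1])
print(1*v1 + 1*v2 + 0*v3 == 0*v1 + 0*v2 + 1*v3)   # True: combinations not unique

# An independent pair in R^3 that does not span it: combinations are still
# unique, since equal combinations differ by a combination summing to zero and
# independence forces those coefficients to be zero.
w1, w2 = sp.Matrix([1, 0, 1]), sp.Matrix([0, 1, 1])
alpha, beta = sp.symbols('alpha beta')
print(sp.solve(list(alpha*w1 + beta*w2), [alpha, beta], dict=True))  # [{alpha: 0, beta: 0}]
```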

This exercise is recommended for all readers.
Problem 17

A square matrix is symmetric if for all indices   and  , entry   equals entry  .

  1. Find a basis for the vector space of symmetric   matrices.
  2. Find a basis for the space of symmetric   matrices.
  3. Find a basis for the space of symmetric   matrices.
Answer
  1. Describing the vector space as
     
    suggests this for a basis.
     
    Verification is easy.
  2. This is one possible basis.
     
  3. As in the prior two questions, we can form a basis from two kinds of matrices. First are the matrices with a single one on the diagonal and all other entries zero (there are   of those matrices). Second are the matrices with two symmetrically placed off-diagonal entries equal to one and all other entries zero. (That is, all entries in   are zero except that   and   are one.)
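The construction described in the last item translates directly into code. This is a sketch under the assumption that the space in question is the n×n symmetric matrices for general n; it builds the diagonal-type and paired-off-diagonal-type matrices and counts them.

```python
import sympy as sp

def symmetric_basis(n):
    """One matrix per diagonal position plus one per unordered off-diagonal
    pair {i, j}, with ones at (i, j) and (j, i); n*(n+1)/2 matrices in all."""
    basis = []
    for i in range(n):                    # single one on the diagonal
        E = sp.zeros(n, n)
        E[i, i] = 1
        basis.append(E)
    for i in range(n):                    # two symmetric off-diagonal ones
        for j in range(i + 1, n):
            E = sp.zeros(n, n)
            E[i, j] = E[j, i] = 1
            basis.append(E)
    return basis

print(len(symmetric_basis(3)))   # 6 = 3*(3+1)/2
```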
This exercise is recommended for all readers.
Problem 18

We can show that every basis for   contains the same number of vectors.

  1. Show that no linearly independent subset of   contains more than three vectors.
  2. Show that no spanning subset of   contains fewer than three vectors. (Hint. Recall how to calculate the span of a set and show that this method, when applied to two vectors, cannot yield all of  .)
Answer
  1. Any four vectors from   are linearly related because the vector equation
     
    gives rise to a linear system
     
    that is homogeneous (and so has a solution) and has four unknowns but only three equations, and therefore has nontrivial solutions. (Of course, this argument applies to any subset of   with four or more vectors.)
  2. Given  , ...,  ,
     
    to decide which vectors
     
    are in the span of  , set up
     
    and row reduce the resulting system.
     
    There are two variables   and   but three equations, so when Gauss' method finishes, the bottom row will contain some relationship of the form  . Hence, vectors in the span of the two-element set   must satisfy some restriction, and so the span is not all of  .
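A rank computation makes the same point quickly; in this sketch the two vectors are hypothetical, since any pair exhibits the restriction.

```python
import sympy as sp

# Hypothetical pair of vectors in R^3.
u1 = sp.Matrix([1, 2, 3])
u2 = sp.Matrix([0, 1, 1])

A = sp.Matrix.hstack(u1, u2)

# A 3 x 2 matrix has at most two pivots, so its column space (the span of
# {u1, u2}) has dimension at most 2 and cannot be all of R^3.
print(A.rank())                    # 2
print(len(A.columnspace()) < 3)    # True: the span is a proper subspace of R^3
```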
Problem 19

One of the exercises in the Subspaces subsection shows that the set

 

is a vector space under these operations.

 

Find a basis.

Answer

We have (using these peculiar operations with care)

 

and so a good candidate for a basis is this.

 

To check linear independence we set up

 

(the vector on the right is the zero object in this space). That yields the linear system

 

with only the solution   and  . Checking the span is similar.