# Linear Algebra/Basis/Solutions

## Solutions

This exercise is recommended for all readers.
Problem 1

Decide if each is a basis for $\mathbb {R} ^{3}$ .

1. $\langle {\begin{pmatrix}1\\2\\3\end{pmatrix}},{\begin{pmatrix}3\\2\\1\end{pmatrix}},{\begin{pmatrix}0\\0\\1\end{pmatrix}}\rangle$
2. $\langle {\begin{pmatrix}1\\2\\3\end{pmatrix}},{\begin{pmatrix}3\\2\\1\end{pmatrix}}\rangle$
3. $\langle {\begin{pmatrix}0\\2\\-1\end{pmatrix}},{\begin{pmatrix}1\\1\\1\end{pmatrix}},{\begin{pmatrix}2\\5\\0\end{pmatrix}}\rangle$
4. $\langle {\begin{pmatrix}0\\2\\-1\end{pmatrix}},{\begin{pmatrix}1\\1\\1\end{pmatrix}},{\begin{pmatrix}1\\3\\0\end{pmatrix}}\rangle$

By Theorem 1.12, each is a basis if and only if each vector in the space can be given in a unique way as a linear combination of the given vectors.

1. Yes, this is a basis. The relation
$c_{1}{\begin{pmatrix}1\\2\\3\end{pmatrix}}+c_{2}{\begin{pmatrix}3\\2\\1\end{pmatrix}}+c_{3}{\begin{pmatrix}0\\0\\1\end{pmatrix}}={\begin{pmatrix}x\\y\\z\end{pmatrix}}$
gives
$\left({\begin{array}{*{3}{c}|c}1&3&0&x\\2&2&0&y\\3&1&1&z\end{array}}\right){\xrightarrow[{-3\rho _{1}+\rho _{3}}]{-2\rho _{1}+\rho _{2}}}\;{\xrightarrow[{}]{-2\rho _{2}+\rho _{3}}}\left({\begin{array}{*{3}{c}|c}1&3&0&x\\0&-4&0&-2x+y\\0&0&1&x-2y+z\end{array}}\right)$
which has the unique solution $c_{3}=x-2y+z$ , $c_{2}=x/2-y/4$ , and $c_{1}=-x/2+3y/4$ .
2. This is not a basis. Setting it up as in the prior item
$c_{1}{\begin{pmatrix}1\\2\\3\end{pmatrix}}+c_{2}{\begin{pmatrix}3\\2\\1\end{pmatrix}}={\begin{pmatrix}x\\y\\z\end{pmatrix}}$
gives a linear system whose solution
$\left({\begin{array}{*{2}{c}|c}1&3&x\\2&2&y\\3&1&z\end{array}}\right){\xrightarrow[{-3\rho _{1}+\rho _{3}}]{-2\rho _{1}+\rho _{2}}}\;{\xrightarrow[{}]{-2\rho _{2}+\rho _{3}}}\left({\begin{array}{*{2}{c}|c}1&3&x\\0&-4&-2x+y\\0&0&x-2y+z\end{array}}\right)$
is possible if and only if the three-tall vector's components $x$ , $y$ , and $z$  satisfy $x-2y+z=0$ . For instance, we can find the coefficients $c_{1}$  and $c_{2}$  that work when $x=1$ , $y=1$ , and $z=1$ . However, there are no $c$ 's that work for $x=1$ , $y=1$ , and $z=2$ . Thus this is not a basis; it does not span the space.
3. Yes, this is a basis. Setting up the relationship leads to this reduction
$\left({\begin{array}{*{3}{c}|c}0&1&2&x\\2&1&5&y\\-1&1&0&z\end{array}}\right){\xrightarrow[{}]{\rho _{1}\leftrightarrow \rho _{3}}}\;{\xrightarrow[{}]{2\rho _{1}+\rho _{2}}}\;{\xrightarrow[{}]{(-1/3)\rho _{2}+\rho _{3}}}\left({\begin{array}{*{3}{c}|c}-1&1&0&z\\0&3&5&y+2z\\0&0&1/3&x-y/3-2z/3\end{array}}\right)$
which has a unique solution for each triple of components $x$ , $y$ , and $z$ .
4. No, this is not a basis. The reduction
$\left({\begin{array}{*{3}{c}|c}0&1&1&x\\2&1&3&y\\-1&1&0&z\end{array}}\right){\xrightarrow[{}]{\rho _{1}\leftrightarrow \rho _{3}}}\;{\xrightarrow[{}]{2\rho _{1}+\rho _{2}}}\;{\xrightarrow[{}]{(-1/3)\rho _{2}+\rho _{3}}}\left({\begin{array}{*{3}{c}|c}-1&1&0&z\\0&3&3&y+2z\\0&0&0&x-y/3-2z/3\end{array}}\right)$
shows that the system does not have a solution for every triple $x$ , $y$ , and $z$ . Instead, the span of the given set includes only those three-tall vectors where $x=y/3+2z/3$ .
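All four determinations can be spot-checked numerically. The sketch below (the `is_basis_R3` helper is an illustrative name, not part of the text) relies on the fact that a sequence of vectors is a basis for $\mathbb {R} ^{3}$ exactly when stacking them as the columns of a matrix gives a square matrix with nonzero determinant.

```python
import numpy as np

# Illustrative helper: a sequence of vectors is a basis for R^3 exactly when
# the matrix having those vectors as columns is 3x3 and invertible.
def is_basis_R3(vectors):
    A = np.column_stack(vectors)
    return A.shape == (3, 3) and not np.isclose(np.linalg.det(A), 0)

print(is_basis_R3([[1, 2, 3], [3, 2, 1], [0, 0, 1]]))    # item 1: True
print(is_basis_R3([[1, 2, 3], [3, 2, 1]]))               # item 2: False (too few)
print(is_basis_R3([[0, 2, -1], [1, 1, 1], [2, 5, 0]]))   # item 3: True
print(is_basis_R3([[0, 2, -1], [1, 1, 1], [1, 3, 0]]))   # item 4: False (dependent)
```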
This exercise is recommended for all readers.
Problem 2

Represent the vector with respect to the basis.

1. ${\begin{pmatrix}1\\2\end{pmatrix}}$ , $B=\langle {\begin{pmatrix}1\\1\end{pmatrix}},{\begin{pmatrix}-1\\1\end{pmatrix}}\rangle \subseteq \mathbb {R} ^{2}$
2. $x^{2}+x^{3}$ , $D=\langle 1,1+x,1+x+x^{2},1+x+x^{2}+x^{3}\rangle \subseteq {\mathcal {P}}_{3}$
3. ${\begin{pmatrix}0\\-1\\0\\1\end{pmatrix}}$ , ${\mathcal {E}}_{4}\subseteq \mathbb {R} ^{4}$
1. We solve
$c_{1}{\begin{pmatrix}1\\1\end{pmatrix}}+c_{2}{\begin{pmatrix}-1\\1\end{pmatrix}}={\begin{pmatrix}1\\2\end{pmatrix}}$
with
$\left({\begin{array}{*{2}{c}|c}1&-1&1\\1&1&2\end{array}}\right){\xrightarrow[{}]{\rho _{1}+\rho _{2}}}\left({\begin{array}{*{2}{c}|c}1&-1&1\\0&2&1\end{array}}\right)$
and conclude that $c_{2}=1/2$  and so $c_{1}=3/2$ . Thus, the representation is this.
${\rm {Rep}}_{B}({\begin{pmatrix}1\\2\end{pmatrix}})={\begin{pmatrix}3/2\\1/2\end{pmatrix}}_{B}$
2. The relationship $c_{1}\cdot (1)+c_{2}\cdot (1+x)+c_{3}\cdot (1+x+x^{2})+c_{4}\cdot (1+x+x^{2}+x^{3})=x^{2}+x^{3}$  is easily solved by eye to give that $c_{4}=1$ , $c_{3}=0$ , $c_{2}=-1$ , and $c_{1}=0$ .
${\rm {Rep}}_{D}(x^{2}+x^{3})={\begin{pmatrix}0\\-1\\0\\1\end{pmatrix}}_{D}$
3. ${\rm {Rep}}_{{\mathcal {E}}_{4}}({\begin{pmatrix}0\\-1\\0\\1\end{pmatrix}})={\begin{pmatrix}0\\-1\\0\\1\end{pmatrix}}_{{\mathcal {E}}_{4}}$
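Representations like these can also be computed by solving the linear system whose coefficient matrix has the basis vectors as columns; a brief numerical sketch for item 1:

```python
import numpy as np

# Columns of B are the basis vectors of B = <(1,1), (-1,1)>.
B = np.array([[1.0, -1.0],
              [1.0,  1.0]])
v = np.array([1.0, 2.0])

# Rep_B(v) is the coefficient vector c with B @ c = v.
c = np.linalg.solve(B, v)
print(c)   # [1.5 0.5], i.e. c1 = 3/2 and c2 = 1/2
```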
Problem 3

Find a basis for ${\mathcal {P}}_{2}$ , the space of all quadratic polynomials. Must any such basis contain a polynomial of each degree: degree zero, degree one, and degree two?

One basis is $\langle 1,x,x^{2}\rangle$ . There are bases for ${\mathcal {P}}_{2}$  that do not contain any polynomials of degree one or degree zero. One is $\langle 1+x+x^{2},x+x^{2},x^{2}\rangle$ . (Every basis has at least one polynomial of degree two, though.)

Problem 4

Find a basis for the solution set of this system.

${\begin{array}{*{4}{rc}r}x_{1}&-&4x_{2}&+&3x_{3}&-&x_{4}&=&0\\2x_{1}&-&8x_{2}&+&6x_{3}&-&2x_{4}&=&0\end{array}}$

The reduction

$\left({\begin{array}{*{4}{c}|c}1&-4&3&-1&0\\2&-8&6&-2&0\end{array}}\right){\xrightarrow[{}]{2\rho _{1}+\rho _{2}}}\left({\begin{array}{*{4}{c}|c}1&-4&3&-1&0\\0&0&0&0&0\end{array}}\right)$

gives that the only condition is that $x_{1}=4x_{2}-3x_{3}+x_{4}$ . The solution set is

$\{{\begin{pmatrix}4x_{2}-3x_{3}+x_{4}\\x_{2}\\x_{3}\\x_{4}\end{pmatrix}}\,{\big |}\,x_{2},x_{3},x_{4}\in \mathbb {R} \}=\{x_{2}{\begin{pmatrix}4\\1\\0\\0\end{pmatrix}}+x_{3}{\begin{pmatrix}-3\\0\\1\\0\end{pmatrix}}+x_{4}{\begin{pmatrix}1\\0\\0\\1\end{pmatrix}}\,{\big |}\,x_{2},x_{3},x_{4}\in \mathbb {R} \}$

and so the obvious candidate for the basis is this.

$\langle {\begin{pmatrix}4\\1\\0\\0\end{pmatrix}},{\begin{pmatrix}-3\\0\\1\\0\end{pmatrix}},{\begin{pmatrix}1\\0\\0\\1\end{pmatrix}}\rangle$

We've shown that this spans the space, and showing it is also linearly independent is routine.
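Both properties are easy to confirm numerically: each candidate vector should solve the homogeneous system, and the three together should have full column rank (hence be linearly independent).

```python
import numpy as np

A = np.array([[1, -4, 3, -1],
              [2, -8, 6, -2]])

# Candidate basis vectors from the parametrization above, as columns.
candidates = np.column_stack([[4, 1, 0, 0],
                              [-3, 0, 1, 0],
                              [1, 0, 0, 1]])

print(A @ candidates)                        # every column maps to zero
print(np.linalg.matrix_rank(candidates))     # 3, so linearly independent
```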

This exercise is recommended for all readers.
Problem 5

Find a basis for ${\mathcal {M}}_{2\!\times \!2}$ , the space of $2\!\times \!2$  matrices.

There are many bases. This is an easy one.

$\langle {\begin{pmatrix}1&0\\0&0\end{pmatrix}},{\begin{pmatrix}0&1\\0&0\end{pmatrix}},{\begin{pmatrix}0&0\\1&0\end{pmatrix}},{\begin{pmatrix}0&0\\0&1\end{pmatrix}}\rangle$
This exercise is recommended for all readers.
Problem 6

Find a basis for each.

1. The subspace $\{a_{2}x^{2}+a_{1}x+a_{0}\,{\big |}\,a_{2}-2a_{1}=a_{0}\}$  of ${\mathcal {P}}_{2}$
2. The space of three-wide row vectors whose first and second components add to zero
3. This subspace of the $2\!\times \!2$  matrices
$\{{\begin{pmatrix}a&b\\0&c\end{pmatrix}}\,{\big |}\,c-2b=0\}$

For each item, many answers are possible.

1. One way to proceed is to parametrize by expressing $a_{2}$  in terms of the other two coefficients: $a_{2}=2a_{1}+a_{0}$ . Then $a_{2}x^{2}+a_{1}x+a_{0}$  is $(2a_{1}+a_{0})x^{2}+a_{1}x+a_{0}$  and
$\{(2a_{1}+a_{0})x^{2}+a_{1}x+a_{0}\,{\big |}\,a_{1},a_{0}\in \mathbb {R} \}=\{a_{1}\cdot (2x^{2}+x)+a_{0}\cdot (x^{2}+1)\,{\big |}\,a_{1},a_{0}\in \mathbb {R} \}$
suggests $\langle 2x^{2}+x,x^{2}+1\rangle$ . This only shows that it spans, but checking that it is linearly independent is routine.
2. Parametrize $\{{\begin{pmatrix}a&b&c\end{pmatrix}}\,{\big |}\,a+b=0\}$  to get $\{{\begin{pmatrix}-b&b&c\end{pmatrix}}\,{\big |}\,b,c\in \mathbb {R} \}$ , which suggests using the sequence $\langle {\begin{pmatrix}-1&1&0\end{pmatrix}},{\begin{pmatrix}0&0&1\end{pmatrix}}\rangle$ . We've shown that it spans, and checking that it is linearly independent is easy.
3. Rewriting
$\{{\begin{pmatrix}a&b\\0&2b\end{pmatrix}}\,{\big |}\,a,b\in \mathbb {R} \}=\{a\cdot {\begin{pmatrix}1&0\\0&0\end{pmatrix}}+b\cdot {\begin{pmatrix}0&1\\0&2\end{pmatrix}}\,{\big |}\,a,b\in \mathbb {R} \}$
suggests this for the basis.
$\langle {\begin{pmatrix}1&0\\0&0\end{pmatrix}},{\begin{pmatrix}0&1\\0&2\end{pmatrix}}\rangle$
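Identifying each quadratic $a_{0}+a_{1}x+a_{2}x^{2}$ with its coefficient vector $(a_{0},a_{1},a_{2})$ turns these verifications into matrix computations; a sketch for item 1:

```python
import numpy as np

# Coefficient vectors (a0, a1, a2) for the suggested basis <2x^2 + x, x^2 + 1>.
p1 = np.array([0, 1, 2])   # 2x^2 + x
p2 = np.array([1, 0, 1])   # x^2 + 1

# Each satisfies the subspace condition a2 - 2*a1 = a0 ...
for a0, a1, a2 in (p1, p2):
    assert a2 - 2 * a1 == a0

# ... and the two are linearly independent: the 3x2 matrix has rank 2.
print(np.linalg.matrix_rank(np.column_stack([p1, p2])))   # 2
```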
Problem 7

Check Example 1.6.

We will show that the second is a basis; the first is similar. We will show this straight from the definition of a basis, because this example appears before Theorem 1.12.

To see that it is linearly independent, we set up $c_{1}\cdot (\cos \theta -\sin \theta )+c_{2}\cdot (2\cos \theta +3\sin \theta )=0\cos \theta +0\sin \theta$ . Taking $\theta =0$  and $\theta =\pi /2$  gives this system

${\begin{array}{*{2}{rc}r}c_{1}\cdot 1&+&c_{2}\cdot 2&=&0\\c_{1}\cdot (-1)&+&c_{2}\cdot 3&=&0\end{array}}\;{\xrightarrow[{}]{\rho _{1}+\rho _{2}}}\;{\begin{array}{*{2}{rc}r}c_{1}&+&2c_{2}&=&0\\&+&5c_{2}&=&0\end{array}}$

which shows that $c_{1}=0$  and $c_{2}=0$ .

The calculation for span is also easy; for any $x,y\in \mathbb {R}$ , we have that $c_{1}\cdot (\cos \theta -\sin \theta )+c_{2}\cdot (2\cos \theta +3\sin \theta )=x\cos \theta +y\sin \theta$  gives that $c_{2}=x/5+y/5$  and that $c_{1}=3x/5-2y/5$ , and so the span is the entire space.
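A quick numerical spot-check of those span formulas, with sample values of $x$ and $y$ chosen arbitrarily:

```python
import numpy as np

# The text gives c2 = x/5 + y/5 and c1 = 3x/5 - 2y/5; verify that these
# reproduce x*cos(t) + y*sin(t) at many sample angles t.
x, y = 2.0, -3.0
c2 = x / 5 + y / 5
c1 = 3 * x / 5 - 2 * y / 5

t = np.linspace(0, 2 * np.pi, 100)
lhs = c1 * (np.cos(t) - np.sin(t)) + c2 * (2 * np.cos(t) + 3 * np.sin(t))
print(np.allclose(lhs, x * np.cos(t) + y * np.sin(t)))   # True
```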

This exercise is recommended for all readers.
Problem 8

Find the span of each set and then find a basis for that span.

1. $\{1+x,1+2x\}$  in ${\mathcal {P}}_{2}$
2. $\{2-2x,3+4x^{2}\}$  in ${\mathcal {P}}_{2}$
1. Asking which $a_{0}+a_{1}x+a_{2}x^{2}$  can be expressed as $c_{1}\cdot (1+x)+c_{2}\cdot (1+2x)$  gives rise to three linear equations, describing the coefficients of $x^{2}$ , $x$ , and the constants.
${\begin{array}{*{2}{rc}r}c_{1}&+&c_{2}&=&a_{0}\\c_{1}&+&2c_{2}&=&a_{1}\\&&0&=&a_{2}\end{array}}$
Gauss' method with back-substitution shows, provided that $a_{2}=0$ , that $c_{2}=-a_{0}+a_{1}$  and $c_{1}=2a_{0}-a_{1}$ . Thus, with $a_{2}=0$ , we can compute appropriate $c_{1}$  and $c_{2}$  for any $a_{0}$  and $a_{1}$ . So the span is the entire set of linear polynomials $\{a_{0}+a_{1}x\,{\big |}\,a_{0},a_{1}\in \mathbb {R} \}$ . Parametrizing that set $\{a_{0}\cdot 1+a_{1}\cdot x\,{\big |}\,a_{0},a_{1}\in \mathbb {R} \}$  suggests a basis $\langle 1,x\rangle$  (we've shown that it spans; checking linear independence is easy).
2. With
$a_{0}+a_{1}x+a_{2}x^{2}=c_{1}\cdot (2-2x)+c_{2}\cdot (3+4x^{2})=(2c_{1}+3c_{2})+(-2c_{1})x+(4c_{2})x^{2}$
we get this system.
${\begin{array}{*{2}{rc}r}2c_{1}&+&3c_{2}&=&a_{0}\\-2c_{1}&&&=&a_{1}\\&&4c_{2}&=&a_{2}\end{array}}\;{\xrightarrow[{}]{\rho _{1}+\rho _{2}}}\;{\xrightarrow[{}]{(-4/3)\rho _{2}+\rho _{3}}}\;{\begin{array}{*{2}{rc}r}2c_{1}&+&3c_{2}&=&a_{0}\\&&3c_{2}&=&a_{0}+a_{1}\\&&0&=&(-4/3)a_{0}-(4/3)a_{1}+a_{2}\end{array}}$
Thus, the only quadratic polynomials $a_{0}+a_{1}x+a_{2}x^{2}$  with associated $c$ 's are the ones such that $0=(-4/3)a_{0}-(4/3)a_{1}+a_{2}$ . Hence the span is $\{(-a_{1}+(3/4)a_{2})+a_{1}x+a_{2}x^{2}\,{\big |}\,a_{1},a_{2}\in \mathbb {R} \}$ . Parametrizing gives $\{a_{1}\cdot (-1+x)+a_{2}\cdot ((3/4)+x^{2})\,{\big |}\,a_{1},a_{2}\in \mathbb {R} \}$ , which suggests $\langle -1+x,(3/4)+x^{2}\rangle$  (checking that it is linearly independent is routine).
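The conclusion for item 2 can be checked with coefficient vectors: each suggested basis polynomial should satisfy the restriction and should actually lie in the span of the two given polynomials.

```python
import numpy as np

# Columns are the coefficient vectors (a0, a1, a2) of 2 - 2x and 3 + 4x^2.
S = np.column_stack([[2, -2, 0],
                     [3, 0, 4]]).astype(float)

# The suggested basis for the span: -1 + x and 3/4 + x^2.
for p in ([-1, 1, 0], [3 / 4, 0, 1]):
    a0, a1, a2 = p
    # Each satisfies the restriction a2 = (4/3)(a0 + a1) ...
    assert np.isclose(a2, (4 / 3) * (a0 + a1))
    # ... and is a combination of the two given polynomials.
    c, *_ = np.linalg.lstsq(S, np.array(p), rcond=None)
    assert np.allclose(S @ c, p)
print("both basis polynomials lie in the span")
```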
This exercise is recommended for all readers.
Problem 9

Find a basis for each of these subspaces of the space ${\mathcal {P}}_{3}$  of cubic polynomials.

1. The subspace of cubic polynomials $p(x)$  such that $p(7)=0$
2. The subspace of polynomials $p(x)$  such that $p(7)=0$  and $p(5)=0$
3. The subspace of polynomials $p(x)$  such that $p(7)=0$ , $p(5)=0$ , and $p(3)=0$
4. The space of polynomials $p(x)$  such that $p(7)=0$ , $p(5)=0$ , $p(3)=0$ , and $p(1)=0$
1. The subspace is $\{a_{0}+a_{1}x+a_{2}x^{2}+a_{3}x^{3}\,{\big |}\,a_{0}+7a_{1}+49a_{2}+343a_{3}=0\}$ . Rewriting $a_{0}=-7a_{1}-49a_{2}-343a_{3}$  gives $\{(-7a_{1}-49a_{2}-343a_{3})+a_{1}x+a_{2}x^{2}+a_{3}x^{3}\,{\big |}\,a_{1},a_{2},a_{3}\in \mathbb {R} \}$ , which, on breaking out the parameters, suggests $\langle -7+x,-49+x^{2},-343+x^{3}\rangle$  for the basis (it is easily verified).
2. The given subspace is the collection of cubics $p(x)=a_{0}+a_{1}x+a_{2}x^{2}+a_{3}x^{3}$  such that $a_{0}+7a_{1}+49a_{2}+343a_{3}=0$  and $a_{0}+5a_{1}+25a_{2}+125a_{3}=0$ . Gauss' method
${\begin{array}{*{4}{rc}r}a_{0}&+&7a_{1}&+&49a_{2}&+&343a_{3}&=&0\\a_{0}&+&5a_{1}&+&25a_{2}&+&125a_{3}&=&0\end{array}}\;{\xrightarrow[{}]{-\rho _{1}+\rho _{2}}}\;{\begin{array}{*{4}{rc}r}a_{0}&+&7a_{1}&+&49a_{2}&+&343a_{3}&=&0\\&&-2a_{1}&-&24a_{2}&-&218a_{3}&=&0\end{array}}$
gives that $a_{1}=-12a_{2}-109a_{3}$  and that $a_{0}=35a_{2}+420a_{3}$ . Rewriting $(35a_{2}+420a_{3})+(-12a_{2}-109a_{3})x+a_{2}x^{2}+a_{3}x^{3}$  as $a_{2}\cdot (35-12x+x^{2})+a_{3}\cdot (420-109x+x^{3})$  suggests this for a basis $\langle 35-12x+x^{2},420-109x+x^{3}\rangle$ . The above shows that it spans the space. Checking it is linearly independent is routine. (Comment. A worthwhile check is to verify that both polynomials in the basis have both seven and five as roots.)
3. Here there are three conditions on the cubics, that $a_{0}+7a_{1}+49a_{2}+343a_{3}=0$ , that $a_{0}+5a_{1}+25a_{2}+125a_{3}=0$ , and that $a_{0}+3a_{1}+9a_{2}+27a_{3}=0$ . Gauss' method
${\begin{array}{*{4}{rc}r}a_{0}&+&7a_{1}&+&49a_{2}&+&343a_{3}&=&0\\a_{0}&+&5a_{1}&+&25a_{2}&+&125a_{3}&=&0\\a_{0}&+&3a_{1}&+&9a_{2}&+&27a_{3}&=&0\end{array}}\;{\xrightarrow[{-\rho _{1}+\rho _{3}}]{-\rho _{1}+\rho _{2}}}\;{\xrightarrow[{}]{-2\rho _{2}+\rho _{3}}}\;{\begin{array}{*{4}{rc}r}a_{0}&+&7a_{1}&+&49a_{2}&+&343a_{3}&=&0\\&&-2a_{1}&-&24a_{2}&-&218a_{3}&=&0\\&&&&8a_{2}&+&120a_{3}&=&0\end{array}}$
yields the single free variable $a_{3}$ , with $a_{2}=-15a_{3}$ , $a_{1}=71a_{3}$ , and $a_{0}=-105a_{3}$ . The parametrization is this.
$\{(-105a_{3})+(71a_{3})x+(-15a_{3})x^{2}+(a_{3})x^{3}\,{\big |}\,a_{3}\in \mathbb {R} \}=\{a_{3}\cdot (-105+71x-15x^{2}+x^{3})\,{\big |}\,a_{3}\in \mathbb {R} \}$
Therefore, a good candidate for the basis is $\langle -105+71x-15x^{2}+x^{3}\rangle$ . It spans the space by the work above. It is clearly linearly independent because it is a one-element set (with that single element not the zero object of the space). Thus, any cubic through the three points $(7,0)$ , $(5,0)$ , and $(3,0)$  is a multiple of this one. (Comment. As in the prior question, a worthwhile check is to verify that plugging seven, five, and three into this polynomial yields zero each time.)
4. This is the trivial subspace of ${\mathcal {P}}_{3}$ . Thus, the basis is empty $\langle \rangle$ .

Remark. The polynomial in the third item could alternatively have been derived by multiplying out $(x-7)(x-5)(x-3)$ .
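The remark, and the root checks suggested in items 2 and 3, are easy to confirm with NumPy's polynomial utilities:

```python
import numpy as np

# The item 3 basis polynomial, coefficients in ascending degree order.
p = np.polynomial.Polynomial([-105, 71, -15, 1])

# It vanishes at 7, 5, and 3 ...
print([float(p(r)) for r in (7, 5, 3)])   # [0.0, 0.0, 0.0]

# ... and equals the monic cubic (x-7)(x-5)(x-3).
q = np.polynomial.Polynomial.fromroots([7, 5, 3])
print(np.allclose(p.coef, q.coef))        # True
```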

Problem 10

We've seen that it is possible for a basis to remain a basis when it is reordered. Must it always remain a basis?

Yes. Linear independence and span are unchanged by reordering.

Problem 11

Can a basis contain a zero vector?

No linearly independent set contains a zero vector.

This exercise is recommended for all readers.
Problem 12

Let $\langle {\vec {\beta }}_{1},{\vec {\beta }}_{2},{\vec {\beta }}_{3}\rangle$  be a basis for a vector space.

1. Show that $\langle c_{1}{\vec {\beta }}_{1},c_{2}{\vec {\beta }}_{2},c_{3}{\vec {\beta }}_{3}\rangle$  is a basis when $c_{1},c_{2},c_{3}\neq 0$ . What happens when at least one $c_{i}$  is $0$ ?
2. Prove that $\langle {\vec {\alpha }}_{1},{\vec {\alpha }}_{2},{\vec {\alpha }}_{3}\rangle$  is a basis where ${\vec {\alpha }}_{i}={\vec {\beta }}_{1}+{\vec {\beta }}_{i}$ .
1. To show that it is linearly independent, note that $d_{1}(c_{1}{\vec {\beta }}_{1})+d_{2}(c_{2}{\vec {\beta }}_{2})+d_{3}(c_{3}{\vec {\beta }}_{3})={\vec {0}}$  gives that $(d_{1}c_{1}){\vec {\beta }}_{1}+(d_{2}c_{2}){\vec {\beta }}_{2}+(d_{3}c_{3}){\vec {\beta }}_{3}={\vec {0}}$ , which in turn implies that each $d_{i}c_{i}$  is zero. But with $c_{i}\neq 0$  that means that each $d_{i}$  is zero. Showing that it spans the space is much the same; because $\langle {\vec {\beta }}_{1},{\vec {\beta }}_{2},{\vec {\beta }}_{3}\rangle$  is a basis, and so spans the space, we can for any ${\vec {v}}$  write ${\vec {v}}=d_{1}{\vec {\beta }}_{1}+d_{2}{\vec {\beta }}_{2}+d_{3}{\vec {\beta }}_{3}$ , and then ${\vec {v}}=(d_{1}/c_{1})(c_{1}{\vec {\beta }}_{1})+(d_{2}/c_{2})(c_{2}{\vec {\beta }}_{2})+(d_{3}/c_{3})(c_{3}{\vec {\beta }}_{3})$ . If any of the scalars are zero then the result is not a basis, because it is not linearly independent.
2. Showing that $\langle 2{\vec {\beta }}_{1},{\vec {\beta }}_{1}+{\vec {\beta }}_{2},{\vec {\beta }}_{1}+{\vec {\beta }}_{3}\rangle$  is linearly independent is easy. To show that it spans the space, assume that ${\vec {v}}=d_{1}{\vec {\beta }}_{1}+d_{2}{\vec {\beta }}_{2}+d_{3}{\vec {\beta }}_{3}$ . Then we can represent the same ${\vec {v}}$  with respect to $\langle 2{\vec {\beta }}_{1},{\vec {\beta }}_{1}+{\vec {\beta }}_{2},{\vec {\beta }}_{1}+{\vec {\beta }}_{3}\rangle$  in this way: ${\vec {v}}=(1/2)(d_{1}-d_{2}-d_{3})(2{\vec {\beta }}_{1})+d_{2}({\vec {\beta }}_{1}+{\vec {\beta }}_{2})+d_{3}({\vec {\beta }}_{1}+{\vec {\beta }}_{3})$ .
Problem 13

Find one vector ${\vec {v}}$  that will make each into a basis for the space.

1. $\langle {\begin{pmatrix}1\\1\end{pmatrix}},{\vec {v}}\rangle$  in $\mathbb {R} ^{2}$
2. $\langle {\begin{pmatrix}1\\1\\0\end{pmatrix}},{\begin{pmatrix}0\\1\\0\end{pmatrix}},{\vec {v}}\rangle$  in $\mathbb {R} ^{3}$
3. $\langle x,1+x^{2},{\vec {v}}\rangle$  in ${\mathcal {P}}_{2}$

Each forms a linearly independent set if ${\vec {v}}$  is omitted. To preserve linear independence, we must expand the span of each. That is, we must determine the span of each (leaving ${\vec {v}}$  out), and then pick a ${\vec {v}}$  lying outside of that span. Then to finish, we must check that the result spans the entire given space. Those checks are routine.

1. Any vector that is not a multiple of the given one, that is, any vector that is not on the line $y=x$  will do here. One is ${\vec {v}}={\vec {e}}_{1}$ .
2. By inspection, we notice that the vector ${\vec {e}}_{3}$  is not in the span of the set of the two given vectors. The check that the resulting set is a basis for $\mathbb {R} ^{3}$  is routine.
3. For any member of the span $\{c_{1}\cdot (x)+c_{2}\cdot (1+x^{2})\,{\big |}\,c_{1},c_{2}\in \mathbb {R} \}$ , the coefficient of $x^{2}$  equals the constant term. So we expand the span if we add a quadratic without this property, say, ${\vec {v}}=1-x^{2}$ . The check that the result is a basis for ${\mathcal {P}}_{2}$  is easy.
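Each choice can be verified with a rank computation; for item 3, identify each quadratic with its coefficient vector $(a_{0},a_{1},a_{2})$.

```python
import numpy as np

# Item 1: append e1 to (1,1); full rank means a basis for R^2.
M1 = np.column_stack([[1, 1], [1, 0]])
print(np.linalg.matrix_rank(M1))   # 2

# Item 3: x -> (0,1,0), 1+x^2 -> (1,0,1), and the choice 1-x^2 -> (1,0,-1).
M3 = np.column_stack([[0, 1, 0], [1, 0, 1], [1, 0, -1]])
print(np.linalg.matrix_rank(M3))   # 3
```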
This exercise is recommended for all readers.
Problem 14

Where $\langle {\vec {\beta }}_{1},\dots ,{\vec {\beta }}_{n}\rangle$  is a basis, show that in this equation

$c_{1}{\vec {\beta }}_{1}+\dots +c_{k}{\vec {\beta }}_{k}=c_{k+1}{\vec {\beta }}_{k+1}+\dots +c_{n}{\vec {\beta }}_{n}$

each of the $c_{i}$ 's is zero. Generalize.

To show that each scalar is zero, move the right side over to get $c_{1}{\vec {\beta }}_{1}+\dots +c_{k}{\vec {\beta }}_{k}-c_{k+1}{\vec {\beta }}_{k+1}-\dots -c_{n}{\vec {\beta }}_{n}={\vec {0}}$ ; since the ${\vec {\beta }}$ 's are linearly independent, every coefficient is zero. The obvious generalization is that in any equation involving only the ${\vec {\beta }}$ 's, and in which each ${\vec {\beta }}$  appears only once, each scalar is zero. For instance, an equation with a combination of the even-indexed basis vectors (i.e., ${\vec {\beta }}_{2}$ , ${\vec {\beta }}_{4}$ , etc.) on the right and the odd-indexed basis vectors on the left also gives the conclusion that all of the coefficients are zero.

Problem 15

A basis contains some of the vectors from a vector space; can it contain them all?

No; no linearly independent set contains the zero vector.

Problem 16

Theorem 1.12 shows that, with respect to a basis, every linear combination is unique. If a subset is not a basis, can linear combinations be not unique? If so, must they be?

Here is a subset of $\mathbb {R} ^{2}$  that is not a basis, and two different linear combinations of its elements that sum to the same vector.

$\{{\begin{pmatrix}1\\2\end{pmatrix}},{\begin{pmatrix}2\\4\end{pmatrix}}\}\qquad 2\cdot {\begin{pmatrix}1\\2\end{pmatrix}}+0\cdot {\begin{pmatrix}2\\4\end{pmatrix}}=0\cdot {\begin{pmatrix}1\\2\end{pmatrix}}+1\cdot {\begin{pmatrix}2\\4\end{pmatrix}}$

Thus, when a subset is not a basis, it can be the case that its linear combinations are not unique.

But just because a subset is not a basis does not imply that its combinations must be not unique. For instance, this set

$\{{\begin{pmatrix}1\\2\end{pmatrix}}\}$

does have the property that

$c_{1}\cdot {\begin{pmatrix}1\\2\end{pmatrix}}=c_{2}\cdot {\begin{pmatrix}1\\2\end{pmatrix}}$

implies that $c_{1}=c_{2}$ . The idea here is that this subset fails to be a basis because it fails to span the space; the proof of the theorem establishes that linear combinations are unique if and only if the subset is linearly independent.

This exercise is recommended for all readers.
Problem 17

A square matrix is symmetric if for all indices $i$  and $j$ , entry $i,j$  equals entry $j,i$ .

1. Find a basis for the vector space of symmetric $2\!\times \!2$  matrices.
2. Find a basis for the space of symmetric $3\!\times \!3$  matrices.
3. Find a basis for the space of symmetric $n\!\times \!n$  matrices.
1. Describing the vector space as
$\{{\begin{pmatrix}a&b\\b&c\end{pmatrix}}\,{\big |}\,a,b,c\in \mathbb {R} \}$
suggests this for a basis.
$\langle {\begin{pmatrix}1&0\\0&0\end{pmatrix}},{\begin{pmatrix}0&0\\0&1\end{pmatrix}},{\begin{pmatrix}0&1\\1&0\end{pmatrix}}\rangle$
Verification is easy.
2. This is one possible basis.
$\langle {\begin{pmatrix}1&0&0\\0&0&0\\0&0&0\end{pmatrix}},{\begin{pmatrix}0&0&0\\0&1&0\\0&0&0\end{pmatrix}},{\begin{pmatrix}0&0&0\\0&0&0\\0&0&1\end{pmatrix}},{\begin{pmatrix}0&1&0\\1&0&0\\0&0&0\end{pmatrix}},{\begin{pmatrix}0&0&1\\0&0&0\\1&0&0\end{pmatrix}},{\begin{pmatrix}0&0&0\\0&0&1\\0&1&0\end{pmatrix}}\rangle$
3. As in the prior two questions, we can form a basis from two kinds of matrices. First are the matrices with a single one on the diagonal and all other entries zero (there are $n$  of those matrices). Second are the matrices with two opposed off-diagonal entries equal to one and all other entries zero. (That is, all entries in $M$  are zero except that $m_{i,j}$  and $m_{j,i}$  are one.)
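The description in item 3 can be made concrete; this sketch (the function name `symmetric_basis` is illustrative) builds the two kinds of matrices and confirms the count $n(n+1)/2$:

```python
import numpy as np

def symmetric_basis(n):
    """Build the basis described above: n diagonal unit matrices, plus one
    matrix per unordered off-diagonal pair {i, j} with ones at (i,j) and (j,i)."""
    basis = []
    for i in range(n):
        M = np.zeros((n, n))
        M[i, i] = 1
        basis.append(M)
    for i in range(n):
        for j in range(i + 1, n):
            M = np.zeros((n, n))
            M[i, j] = M[j, i] = 1
            basis.append(M)
    return basis

# For n = 3 this reproduces the six matrices of item 2.
print(len(symmetric_basis(3)))   # 6, i.e. n(n+1)/2
```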
This exercise is recommended for all readers.
Problem 18

We can show that every basis for $\mathbb {R} ^{3}$  contains the same number of vectors.

1. Show that no linearly independent subset of $\mathbb {R} ^{3}$  contains more than three vectors.
2. Show that no spanning subset of $\mathbb {R} ^{3}$  contains fewer than three vectors. (Hint. Recall how to calculate the span of a set and show that this method, when applied to two vectors, cannot yield all of $\mathbb {R} ^{3}$ .)
1. Any four vectors from $\mathbb {R} ^{3}$  are linearly related because the vector equation
$c_{1}{\begin{pmatrix}x_{1}\\y_{1}\\z_{1}\end{pmatrix}}+c_{2}{\begin{pmatrix}x_{2}\\y_{2}\\z_{2}\end{pmatrix}}+c_{3}{\begin{pmatrix}x_{3}\\y_{3}\\z_{3}\end{pmatrix}}+c_{4}{\begin{pmatrix}x_{4}\\y_{4}\\z_{4}\end{pmatrix}}={\begin{pmatrix}0\\0\\0\end{pmatrix}}$
gives rise to a linear system
${\begin{array}{*{4}{rc}r}x_{1}c_{1}&+&x_{2}c_{2}&+&x_{3}c_{3}&+&x_{4}c_{4}&=&0\\y_{1}c_{1}&+&y_{2}c_{2}&+&y_{3}c_{3}&+&y_{4}c_{4}&=&0\\z_{1}c_{1}&+&z_{2}c_{2}&+&z_{3}c_{3}&+&z_{4}c_{4}&=&0\end{array}}$
that is homogeneous (and so has a solution) and has four unknowns but only three equations, and therefore has nontrivial solutions. (Of course, this argument applies to any subset of $\mathbb {R} ^{3}$  with four or more vectors.)
2. Given $x_{1}$ , ..., $z_{2}$ ,
$S=\{{\begin{pmatrix}x_{1}\\y_{1}\\z_{1}\end{pmatrix}},{\begin{pmatrix}x_{2}\\y_{2}\\z_{2}\end{pmatrix}}\}$
to decide which vectors
${\begin{pmatrix}x\\y\\z\end{pmatrix}}$
are in the span of $S$ , set up
$c_{1}{\begin{pmatrix}x_{1}\\y_{1}\\z_{1}\end{pmatrix}}+c_{2}{\begin{pmatrix}x_{2}\\y_{2}\\z_{2}\end{pmatrix}}={\begin{pmatrix}x\\y\\z\end{pmatrix}}$
and row reduce the resulting system.
${\begin{array}{*{2}{rc}r}x_{1}c_{1}&+&x_{2}c_{2}&=&x\\y_{1}c_{1}&+&y_{2}c_{2}&=&y\\z_{1}c_{1}&+&z_{2}c_{2}&=&z\end{array}}$
There are two variables $c_{1}$  and $c_{2}$  but three equations, so when Gauss' method finishes, the bottom row will show some relationship of the form $0=m_{1}x+m_{2}y+m_{3}z$ . Hence, vectors in the span of the two-element set $S$  must satisfy some restriction, and so the span is not all of $\mathbb {R} ^{3}$ .
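Both counting arguments can be illustrated with rank computations on randomly chosen vectors (the seed here is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Four vectors in R^3 as columns: the rank is at most 3 < 4, so the
# homogeneous system above always has a nontrivial solution.
four = rng.standard_normal((3, 4))
print(np.linalg.matrix_rank(four) < 4)   # True

# Two vectors in R^3: the rank is at most 2, so their span cannot be
# all of R^3 -- some vector fails the bottom-row restriction.
two = rng.standard_normal((3, 2))
print(np.linalg.matrix_rank(two) < 3)    # True
```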
Problem 19

One of the exercises in the Subspaces subsection shows that the set

$\{{\begin{pmatrix}x\\y\\z\end{pmatrix}}\,{\big |}\,x+y+z=1\}$

is a vector space under these operations.

${\begin{pmatrix}x_{1}\\y_{1}\\z_{1}\end{pmatrix}}+{\begin{pmatrix}x_{2}\\y_{2}\\z_{2}\end{pmatrix}}={\begin{pmatrix}x_{1}+x_{2}-1\\y_{1}+y_{2}\\z_{1}+z_{2}\end{pmatrix}}\qquad r{\begin{pmatrix}x\\y\\z\end{pmatrix}}={\begin{pmatrix}rx-r+1\\ry\\rz\end{pmatrix}}$

Find a basis.

We have (using these peculiar operations with care)

$\{{\begin{pmatrix}1-y-z\\y\\z\end{pmatrix}}\,{\big |}\,y,z\in \mathbb {R} \}=\{{\begin{pmatrix}-y+1\\y\\0\end{pmatrix}}+{\begin{pmatrix}-z+1\\0\\z\end{pmatrix}}\,{\big |}\,y,z\in \mathbb {R} \}=\{y\cdot {\begin{pmatrix}0\\1\\0\end{pmatrix}}+z\cdot {\begin{pmatrix}0\\0\\1\end{pmatrix}}\,{\big |}\,y,z\in \mathbb {R} \}$

and so a good candidate for a basis is this.

$\langle {\begin{pmatrix}0\\1\\0\end{pmatrix}},{\begin{pmatrix}0\\0\\1\end{pmatrix}}\rangle$

To check linear independence we set up

$c_{1}{\begin{pmatrix}0\\1\\0\end{pmatrix}}+c_{2}{\begin{pmatrix}0\\0\\1\end{pmatrix}}={\begin{pmatrix}1\\0\\0\end{pmatrix}}$

(the vector on the right is the zero object in this space). That yields the linear system

${\begin{array}{*{3}{rc}r}(-c_{1}+1)&+&(-c_{2}+1)&-&1&=&1\\c_{1}&&&&&=&0\\&&c_{2}&&&=&0\end{array}}$

with only the solution $c_{1}=0$  and $c_{2}=0$ . Checking the span is similar.
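The peculiar operations can be coded directly, which makes these checks mechanical; a sketch with illustrative function names:

```python
# The nonstandard vector-space operations from the problem statement.
def add(u, v):
    return (u[0] + v[0] - 1, u[1] + v[1], u[2] + v[2])

def scale(r, v):
    return (r * v[0] - r + 1, r * v[1], r * v[2])

zero = (1, 0, 0)                  # the zero object of this space
b1, b2 = (0, 1, 0), (0, 0, 1)     # the candidate basis

# zero really is the additive identity ...
print(add((2, 3, 4), zero))               # (2, 3, 4)

# ... and the basis spans: y*b1 + z*b2 hits the member (1-y-z, y, z).
y, z = 2, 3
print(add(scale(y, b1), scale(z, b2)))    # (-4, 2, 3), and 1 - y - z = -4
```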