# Linear Algebra/Dimension/Solutions

## Solutions

Assume that all spaces are finite-dimensional unless otherwise stated.

This exercise is recommended for all readers.
Problem 1

Find a basis for, and the dimension of, ${\mathcal {P}}_{2}$ .

One basis is $\langle 1,x,x^{2}\rangle$ , and so the dimension is three.

Problem 2

Find a basis for, and the dimension of, the solution set of this system.

${\begin{array}{*{4}{rc}r}x_{1}&-&4x_{2}&+&3x_{3}&-&x_{4}&=&0\\2x_{1}&-&8x_{2}&+&6x_{3}&-&2x_{4}&=&0\end{array}}$

The solution set is

$\{{\begin{pmatrix}4x_{2}-3x_{3}+x_{4}\\x_{2}\\x_{3}\\x_{4}\end{pmatrix}}\,{\big |}\,x_{2},x_{3},x_{4}\in \mathbb {R} \}$

so a natural basis is this

$\langle {\begin{pmatrix}4\\1\\0\\0\end{pmatrix}},{\begin{pmatrix}-3\\0\\1\\0\end{pmatrix}},{\begin{pmatrix}1\\0\\0\\1\end{pmatrix}}\rangle$

(checking linear independence is easy). Thus the dimension is three.
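The basis can also be double-checked mechanically. This is a sketch (an added check, not part of the book's solution) that uses sympy to recompute a nullspace basis for the coefficient matrix:

```python
import sympy as sp

# Coefficient matrix of the homogeneous system; the second row is
# twice the first, so the rank is one and the nullity is three.
A = sp.Matrix([[1, -4, 3, -1],
               [2, -8, 6, -2]])

basis = A.nullspace()  # sympy's basis for the solution set
print(A.rank())        # 1
print(len(basis))      # 3 = dimension of the solution space
```

The three vectors sympy reports, one per free variable $x_2$, $x_3$, $x_4$, are exactly the ones displayed above.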

This exercise is recommended for all readers.
Problem 3

Find a basis for, and the dimension of, ${\mathcal {M}}_{2\!\times \!2}$ , the vector space of $2\!\times \!2$  matrices.

For this space

$\{{\begin{pmatrix}a&b\\c&d\end{pmatrix}}\,{\big |}\,a,b,c,d\in \mathbb {R} \}=\{a\cdot {\begin{pmatrix}1&0\\0&0\end{pmatrix}}+\dots +d\cdot {\begin{pmatrix}0&0\\0&1\end{pmatrix}}\,{\big |}\,a,b,c,d\in \mathbb {R} \}$

this is a natural basis.

$\langle {\begin{pmatrix}1&0\\0&0\end{pmatrix}},{\begin{pmatrix}0&1\\0&0\end{pmatrix}},{\begin{pmatrix}0&0\\1&0\end{pmatrix}},{\begin{pmatrix}0&0\\0&1\end{pmatrix}}\rangle$

The dimension is four.

Problem 4

Find the dimension of the vector space of matrices

${\begin{pmatrix}a&b\\c&d\end{pmatrix}}$

subject to each condition.

1. $a,b,c,d\in \mathbb {R}$
2. $a-b+2c=0$  and $d\in \mathbb {R}$
3. $a+b+c=0$ , $a+b-c=0$ , and $d\in \mathbb {R}$
1. As in the prior exercise, the space ${\mathcal {M}}_{2\!\times \!2}$  of matrices without restriction has this basis
$\langle {\begin{pmatrix}1&0\\0&0\end{pmatrix}},{\begin{pmatrix}0&1\\0&0\end{pmatrix}},{\begin{pmatrix}0&0\\1&0\end{pmatrix}},{\begin{pmatrix}0&0\\0&1\end{pmatrix}}\rangle$
and so the dimension is four.
2. For this space
$\{{\begin{pmatrix}a&b\\c&d\end{pmatrix}}\,{\big |}\,a=b-2c{\text{ and }}d\in \mathbb {R} \}=\{b\cdot {\begin{pmatrix}1&1\\0&0\end{pmatrix}}+c\cdot {\begin{pmatrix}-2&0\\1&0\end{pmatrix}}+d\cdot {\begin{pmatrix}0&0\\0&1\end{pmatrix}}\,{\big |}\,b,c,d\in \mathbb {R} \}$
this is a natural basis.
$\langle {\begin{pmatrix}1&1\\0&0\end{pmatrix}},{\begin{pmatrix}-2&0\\1&0\end{pmatrix}},{\begin{pmatrix}0&0\\0&1\end{pmatrix}}\rangle$
The dimension is three.
3. Gauss' method applied to the two-equation linear system gives that $c=0$  and that $a=-b$ . Thus, we have this description
$\{{\begin{pmatrix}-b&b\\0&d\end{pmatrix}}\,{\big |}\,b,d\in \mathbb {R} \}=\{b\cdot {\begin{pmatrix}-1&1\\0&0\end{pmatrix}}+d\cdot {\begin{pmatrix}0&0\\0&1\end{pmatrix}}\,{\big |}\,b,d\in \mathbb {R} \}$
and so this is a natural basis.
$\langle {\begin{pmatrix}-1&1\\0&0\end{pmatrix}},{\begin{pmatrix}0&0\\0&1\end{pmatrix}}\rangle$
The dimension is two.
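Each part can be recounted as "four minus the number of independent constraints." A sympy sketch (an added check, with the coordinates ordered $(a,b,c,d)$ by choice):

```python
import sympy as sp

# Each condition is a homogeneous linear constraint on (a, b, c, d),
# so the dimension is 4 minus the rank of the constraint matrix.
no_constraints = sp.zeros(0, 4)                 # part 1: no conditions
part2 = sp.Matrix([[1, -1, 2, 0]])              # a - b + 2c = 0
part3 = sp.Matrix([[1, 1, 1, 0],                # a + b + c = 0
                   [1, 1, -1, 0]])              # a + b - c = 0

dims = [4 - m.rank() for m in (no_constraints, part2, part3)]
print(dims)  # [4, 3, 2]
```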
This exercise is recommended for all readers.
Problem 5

Find the dimension of each.

1. The space of cubic polynomials $p(x)$  such that $p(7)=0$
2. The space of cubic polynomials $p(x)$  such that $p(7)=0$  and $p(5)=0$
3. The space of cubic polynomials $p(x)$  such that $p(7)=0$ , $p(5)=0$ , and $p(3)=0$
4. The space of cubic polynomials $p(x)$  such that $p(7)=0$ , $p(5)=0$ , $p(3)=0$ , and $p(1)=0$

The bases for these spaces are developed in the answer set of the prior subsection.

1. One basis is $\langle -7+x,-49+x^{2},-343+x^{3}\rangle$ . The dimension is three.
2. One basis is $\langle 35-12x+x^{2},420-109x+x^{3}\rangle$  so the dimension is two.
3. A basis is $\langle -105+71x-15x^{2}+x^{3}\rangle$ . The dimension is one.
4. This is the trivial subspace of ${\mathcal {P}}_{3}$  and so the basis is empty. The dimension is zero.
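These counts can be rechecked by rank: writing a cubic as $c_0+c_1x+c_2x^2+c_3x^3$, each root condition $p(r)=0$ is one linear constraint on the coefficients. A sympy sketch (added here, not from the book):

```python
import sympy as sp

x = sp.symbols('x')

def dim_with_roots(roots):
    # Each p(r) = 0 gives the constraint row (1, r, r^2, r^3) on the
    # coefficients (c0, c1, c2, c3) of a cubic.
    m = sp.Matrix([[1, r, r**2, r**3] for r in roots]) if roots else sp.zeros(0, 4)
    return 4 - m.rank()

print([dim_with_roots(rs) for rs in ([7], [7, 5], [7, 5, 3], [7, 5, 3, 1])])
# [3, 2, 1, 0]

# The one-dimensional case's basis vector really does vanish at 7, 5, 3.
p = -105 + 71*x - 15*x**2 + x**3
print([p.subs(x, r) for r in (7, 5, 3)])  # [0, 0, 0]
```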
Problem 6

What is the dimension of the span of the set $\{\cos ^{2}\theta ,\sin ^{2}\theta ,\cos 2\theta ,\sin 2\theta \}$ ? This span is a subspace of the space of all real-valued functions of one real variable.

First recall that $\cos 2\theta =\cos ^{2}\theta -\sin ^{2}\theta$ , and so deletion of $\cos 2\theta$  from this set leaves the span unchanged. What's left, the set $\{\cos ^{2}\theta ,\sin ^{2}\theta ,\sin 2\theta \}$ , is linearly independent (consider the relationship $c_{1}\cos ^{2}\theta +c_{2}\sin ^{2}\theta +c_{3}\sin 2\theta =Z(\theta )$  where $Z$  is the zero function, and then take $\theta =0$ , $\theta =\pi /4$ , and $\theta =\pi /2$  to conclude that each $c$  is zero). It is therefore a basis for its span. That shows that the span is a dimension three vector space.
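The same evaluation idea runs numerically: sample each function at several angles and compute the rank of the resulting matrix. A numpy sketch (an added check; the sample angles are arbitrary):

```python
import numpy as np

ts = np.linspace(0.1, 3.0, 8)  # a handful of sample angles

# All four functions: the rank is only 3 because the cos(2t) column
# is the difference of the first two columns.
M4 = np.column_stack([np.cos(ts)**2, np.sin(ts)**2,
                      np.cos(2 * ts), np.sin(2 * ts)])
print(np.linalg.matrix_rank(M4))  # 3

# After deleting cos(2t), the remaining three are independent.
M3 = np.column_stack([np.cos(ts)**2, np.sin(ts)**2, np.sin(2 * ts)])
print(np.linalg.matrix_rank(M3))  # 3
```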

Problem 7

Find the dimension of $\mathbb {C} ^{47}$ , the vector space of $47$ -tuples of complex numbers.

Here is a basis

$\langle (1+0i,0+0i,\dots ,0+0i),\,(0+1i,0+0i,\dots ,0+0i),(0+0i,1+0i,\dots ,0+0i),\ldots \rangle$

and so the dimension is $2\cdot 47=94$ .

Problem 8

What is the dimension of the vector space ${\mathcal {M}}_{3\!\times \!5}$  of $3\!\times \!5$  matrices?

A basis is

$\langle {\begin{pmatrix}1&0&0&0&0\\0&0&0&0&0\\0&0&0&0&0\end{pmatrix}},{\begin{pmatrix}0&1&0&0&0\\0&0&0&0&0\\0&0&0&0&0\end{pmatrix}},\dots ,{\begin{pmatrix}0&0&0&0&0\\0&0&0&0&0\\0&0&0&0&1\end{pmatrix}}\rangle$

and thus the dimension is $3\cdot 5=15$ .

This exercise is recommended for all readers.
Problem 9

Show that this is a basis for $\mathbb {R} ^{4}$ .

$\langle {\begin{pmatrix}1\\0\\0\\0\end{pmatrix}},{\begin{pmatrix}1\\1\\0\\0\end{pmatrix}},{\begin{pmatrix}1\\1\\1\\0\end{pmatrix}},{\begin{pmatrix}1\\1\\1\\1\end{pmatrix}}\rangle$

(The results of this subsection can be used to simplify this job.)

In a four-dimensional space a set of four vectors is linearly independent if and only if it spans the space. The form of these vectors makes linear independence easy to show (look at the equation of fourth components, then at the equation of third components, etc.).
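Since the set has four members in the four-dimensional $\mathbb{R}^4$, a single rank computation settles it. A small numpy check (added, not part of the book's solution):

```python
import numpy as np

# The four vectors as columns of a matrix; rank 4 means they are
# linearly independent and hence, being four vectors in R^4, a basis.
B = np.column_stack([[1, 0, 0, 0],
                     [1, 1, 0, 0],
                     [1, 1, 1, 0],
                     [1, 1, 1, 1]])
print(np.linalg.matrix_rank(B))  # 4
```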

Problem 10

Refer to Example 2.9.

1. Sketch a similar subspace diagram for ${\mathcal {P}}_{2}$ .
2. Sketch one for ${\mathcal {M}}_{2\!\times \!2}$ .
1. The diagram for ${\mathcal {P}}_{2}$  has four levels. The top level has the only three-dimensional subspace, ${\mathcal {P}}_{2}$  itself. The next level contains the two-dimensional subspaces (not just the linear polynomials; any two-dimensional subspace, like those polynomials of the form $ax^{2}+b$ ). Below that are the one-dimensional subspaces. Finally, of course, is the only zero-dimensional subspace, the trivial subspace.
2. For ${\mathcal {M}}_{2\!\times \!2}$ , the diagram has five levels, including subspaces of dimension four through zero.
This exercise is recommended for all readers.
Problem 11
Where $S$  is a set, the functions $f:S\to \mathbb {R}$  form a vector space under the natural operations: the sum $f+g$  is the function given by $f+g\,(s)=f(s)+g(s)$  and the scalar product is given by $r\cdot f\,(s)=r\cdot f(s)$ . What is the dimension of the space resulting for each domain?
1. $S=\{1\}$
2. $S=\{1,2\}$
3. $S=\{1,\ldots ,n\}$
1. One
2. Two
3. $n$
Problem 12

(See Problem 11.) Prove that this is an infinite-dimensional space: the set of all functions $f:\mathbb {R} \to \mathbb {R}$  under the natural operations.

We need only produce an infinite linearly independent set. One is $\langle f_{1},f_{2},\ldots \rangle$  where $f_{i}:\mathbb {R} \to \mathbb {R}$  is

$f_{i}(x)={\begin{cases}1&{\text{if }}x=i\\0&{\text{otherwise}}\end{cases}}$

the function that has value $1$  only at $x=i$ .
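The independence of any finite batch of these functions can be seen concretely: sampling $f_1,\ldots,f_k$ at the points $1,\ldots,k$ yields the $k\times k$ identity matrix, which has full rank. A numpy sketch (added, with $k=5$ as an arbitrary choice):

```python
import numpy as np

k = 5
# Row i holds the values of f_i at x = 1, ..., k; since f_i is 1 at
# x = i and 0 elsewhere, this is the k-by-k identity matrix.
samples = np.array([[1.0 if x == i else 0.0 for x in range(1, k + 1)]
                    for i in range(1, k + 1)])
print(np.linalg.matrix_rank(samples))  # 5
```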

Problem 13

(See Problem 11.) What is the dimension of the vector space of functions $f:S\to \mathbb {R}$ , under the natural operations, where the domain $S$  is the empty set?

Considering a function to be a set, specifically, a set of ordered pairs $(x,f(x))$ , then the only function with an empty domain is the empty set. Thus this is a trivial vector space, and has dimension zero.

Problem 14

Show that any set of four vectors in $\mathbb {R} ^{2}$  is linearly dependent.

Apply Corollary 2.8.

Problem 15

Show that the set $\langle {\vec {\alpha }}_{1},{\vec {\alpha }}_{2},{\vec {\alpha }}_{3}\rangle \subset \mathbb {R} ^{3}$  is a basis if and only if there is no plane through the origin containing all three vectors.

A plane has the form $\{{\vec {p}}+t_{1}{\vec {v}}_{1}+t_{2}{\vec {v}}_{2}\,{\big |}\,t_{1},t_{2}\in \mathbb {R} \}$ . (The first chapter also calls this a "$2$ -flat", and contains a discussion of why this is equivalent to the description often taken in Calculus as the set of points $(x,y,z)$  subject to a condition of the form $ax+by+cz=d$ ). When the plane passes through the origin we can take the particular vector ${\vec {p}}$  to be ${\vec {0}}$ . Thus, in the language we have developed in this chapter, a plane through the origin is the span of a set of two vectors.

Now for the statement. Asserting that the three are not coplanar is the same as asserting that no vector lies in the span of the other two— no vector is a linear combination of the other two. That's simply an assertion that the three-element set is linearly independent. By Corollary 2.12, that's equivalent to an assertion that the set is a basis for $\mathbb {R} ^{3}$ .

Problem 16
1. Prove that any subspace of a finite dimensional space has a basis.
2. Prove that any subspace of a finite dimensional space is finite dimensional.

Let the space $V$  be finite dimensional. Let $S$  be a subspace of $V$ .

1. The empty set is a linearly independent subset of $S$ . By Corollary 2.10, it can be expanded to a basis for the vector space $S$ .
2. Any basis for the subspace $S$  is a linearly independent set in the superspace $V$ . Hence it can be expanded to a basis for the superspace, which is finite dimensional. Therefore it has only finitely many members.
Problem 17

Where is the finiteness of $B$  used in Theorem 2.3?

It ensures that we exhaust the ${\vec {\beta }}$ 's. That is, it justifies the first sentence of the last paragraph.

This exercise is recommended for all readers.
Problem 18

Prove that if $U$  and $W$  are both three-dimensional subspaces of $\mathbb {R} ^{5}$  then $U\cap W$  is non-trivial. Generalize.

Let $B_{U}=\langle {\vec {u}}_{1},{\vec {u}}_{2},{\vec {u}}_{3}\rangle$  be a basis for $U$  and let $B_{W}=\langle {\vec {w}}_{1},{\vec {w}}_{2},{\vec {w}}_{3}\rangle$  be a basis for $W$ . The set $B_{U}\cup B_{W}$  is linearly dependent, as it is a six-member subset of the five-dimensional space $\mathbb {R} ^{5}$ . Write out a nontrivial dependence $c_{1}{\vec {u}}_{1}+c_{2}{\vec {u}}_{2}+c_{3}{\vec {u}}_{3}+d_{1}{\vec {w}}_{1}+d_{2}{\vec {w}}_{2}+d_{3}{\vec {w}}_{3}={\vec {0}}$  and move the ${\vec {w}}$  terms to the other side; the resulting vector $c_{1}{\vec {u}}_{1}+c_{2}{\vec {u}}_{2}+c_{3}{\vec {u}}_{3}=-d_{1}{\vec {w}}_{1}-d_{2}{\vec {w}}_{2}-d_{3}{\vec {w}}_{3}$  lies in both $U$  and $W$ , and it is nonzero, since otherwise the independence of each basis would force all six coefficients to be zero. Thus $U\cap W$  is more than just the trivial space $\{{\vec {0}}\,\}$ .

Generalization: if $U,W$  are subspaces of a vector space of dimension $n$  and if $\dim(U)+\dim(W)>n$  then they have a nontrivial intersection.
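A numeric illustration of the generalization (a sketch; it uses the standard formula $\dim(U\cap W)=\dim U+\dim W-\dim(U+W)$, which is not proved in this section, and random subspaces chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
BU = rng.standard_normal((3, 5))  # rows span a (generically 3-dim) U in R^5
BW = rng.standard_normal((3, 5))  # rows span a (generically 3-dim) W in R^5

dim_U = np.linalg.matrix_rank(BU)
dim_W = np.linalg.matrix_rank(BW)
dim_sum = np.linalg.matrix_rank(np.vstack([BU, BW]))  # dim(U + W), at most 5

dim_intersection = dim_U + dim_W - dim_sum
print(dim_intersection)  # at least 3 + 3 - 5 = 1
```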

Problem 19

Because a basis for a space is a subset of that space, we are naturally led to how the property "is a basis" interacts with set operations.

1. Consider first how bases might be related by "subset". Assume that $U,W$  are subspaces of some vector space and that $U\subseteq W$ . Can there exist bases $B_{U}$  for $U$  and $B_{W}$  for $W$  such that $B_{U}\subseteq B_{W}$ ? Must such bases exist? For any basis $B_{U}$  for $U$ , must there be a basis $B_{W}$  for $W$  such that $B_{U}\subseteq B_{W}$ ? For any basis $B_{W}$  for $W$ , must there be a basis $B_{U}$  for $U$  such that $B_{U}\subseteq B_{W}$ ? For any bases $B_{U},B_{W}$  for $U$  and $W$ , must $B_{U}$  be a subset of $B_{W}$ ?
2. Is the intersection of bases a basis? For what space?
3. Is the union of bases a basis? For what space?

(Hint. Test any conjectures against some subspaces of $\mathbb {R} ^{3}$ .)

First, note that a set is a basis for some space if and only if it is linearly independent, because in that case it is a basis for its own span.

1. The answer to the third question ("For any basis $B_{U}$  for $U$ , must there be a basis $B_{W}$  for $W$  such that $B_{U}\subseteq B_{W}$ ?") is "yes", which implies "yes" answers to the first two questions. If $B_{U}$  is a basis for $U$  then $B_{U}$  is a linearly independent subset of $W$ . Apply Corollary 2.10 to expand it to a basis for $W$ . That is the desired $B_{W}$ . The answer to the fourth question is "no", which implies a "no" answer to the fifth. Here is an example of a basis for a superspace with no sub-basis forming a basis for a subspace: in $W=\mathbb {R} ^{2}$ , consider the standard basis ${\mathcal {E}}_{2}$ . No sub-basis of ${\mathcal {E}}_{2}$  forms a basis for the subspace $U$  of $\mathbb {R} ^{2}$  that is the line $y=x$ .
2. It is a basis (for its span) because the intersection of linearly independent sets is linearly independent (the intersection is a subset of each of the linearly independent sets). It is not, however, a basis for the intersection of the spaces. For instance, these are bases for $\mathbb {R} ^{2}$ :
$B_{1}=\langle {\begin{pmatrix}1\\0\end{pmatrix}},{\begin{pmatrix}0\\1\end{pmatrix}}\rangle \quad {\text{and}}\quad B_{2}=\langle {\begin{pmatrix}2\\0\end{pmatrix}},{\begin{pmatrix}0\\2\end{pmatrix}}\rangle$
and $\mathbb {R} ^{2}\cap \mathbb {R} ^{2}=\mathbb {R} ^{2}$ , but $B_{1}\cap B_{2}$  is empty. All we can say is that the intersection of the bases is a basis for a subset of the intersection of the spaces.
3. The union of bases need not be a basis: in $\mathbb {R} ^{2}$
$B_{1}=\langle {\begin{pmatrix}1\\0\end{pmatrix}},{\begin{pmatrix}1\\1\end{pmatrix}}\rangle \quad {\text{and}}\quad B_{2}=\langle {\begin{pmatrix}1\\0\end{pmatrix}},{\begin{pmatrix}0\\2\end{pmatrix}}\rangle$
have a union $B_{1}\cup B_{2}$  that is not linearly independent. There is a necessary and sufficient condition for the union of two bases to be a basis,
$B_{1}\cup B_{2}{\text{ is linearly independent }}\quad \iff \quad [B_{1}\cap B_{2}]=[B_{1}]\cap [B_{2}]$
and it is easy enough to prove (but perhaps hard to apply).
4. The complement of a basis cannot be a basis because it contains the zero vector.
This exercise is recommended for all readers.
Problem 20

Consider how "dimension" interacts with "subset". Assume $U$  and $W$  are both subspaces of some vector space, and that $U\subseteq W$ .

1. Prove that $\dim(U)\leq \dim(W)$ .
2. Prove that equality of dimension holds if and only if $U=W$ .
3. Show that the prior item does not hold if they are infinite-dimensional.
1. A basis for $U$  is a linearly independent set in $W$  and so can be expanded via Corollary 2.10 to a basis for $W$ . The second basis has at least as many members as the first.
2. One direction is clear: if $U=W$  then they have the same dimension. For the converse, let $B_{U}$  be a basis for $U$ . It is a linearly independent subset of $W$  and so can be expanded to a basis for $W$ . If $\dim(U)=\dim(W)$  then this basis for $W$  has no more members than does $B_{U}$  and so equals $B_{U}$ . Since $U$  and $W$  have the same bases, they are equal.
3. Let $W$  be the space of finite-degree polynomials and let $U$  be the subspace of polynomials that have only even-powered terms $\{a_{0}+a_{1}x^{2}+a_{2}x^{4}+\dots +a_{n}x^{2n}\,{\big |}\,a_{0},\ldots ,a_{n}\in \mathbb {R} \}$ . Both spaces have infinite dimension, but $U$  is a proper subspace.
? Problem 21

For any vector ${\vec {v}}$  in $\mathbb {R} ^{n}$  and any permutation $\sigma$  of the numbers $1$ , $2$ , ..., $n$  (that is, $\sigma$  is a rearrangement of those numbers into a new order), define $\sigma ({\vec {v}})$  to be the vector whose components are $v_{\sigma (1)}$ , $v_{\sigma (2)}$ , ..., and $v_{\sigma (n)}$  (where $\sigma (1)$  is the first number in the rearrangement, etc.). Now fix ${\vec {v}}$  and let $V$  be the span of $\{\sigma ({\vec {v}})\,{\big |}\,\sigma {\text{ permutes }}1,\ldots ,n\}$ . What are the possibilities for the dimension of $V$ ? (Gilbert, Krusemeyer & Larson 1993, Problem 47)

The possibilities for the dimension of $V$  are $0$ , $1$ , $n-1$ , and $n$ .

To see this, first consider the case when all the coordinates of ${\vec {v}}$  are equal.

${\vec {v}}={\begin{pmatrix}z\\z\\\vdots \\z\end{pmatrix}}$

Then $\sigma ({\vec {v}})={\vec {v}}$  for every permutation $\sigma$ , so $V$  is just the span of ${\vec {v}}$ , which has dimension $0$  or $1$  according to whether ${\vec {v}}$  is ${\vec {0}}$  or not.

Now suppose not all the coordinates of ${\vec {v}}$  are equal; let $x$  and $y$  with $x\neq y$  be among the coordinates of ${\vec {v}}$ . Then we can find permutations $\sigma _{1}$  and $\sigma _{2}$  such that

$\sigma _{1}({\vec {v}})={\begin{pmatrix}x\\y\\a_{3}\\\vdots \\a_{n}\end{pmatrix}}\quad {\text{and}}\quad \sigma _{2}({\vec {v}})={\begin{pmatrix}y\\x\\a_{3}\\\vdots \\a_{n}\end{pmatrix}}$

for some $a_{3},\ldots ,a_{n}\in \mathbb {R}$ . Therefore,

${\frac {1}{y-x}}{\bigl (}\sigma _{1}({\vec {v}})-\sigma _{2}({\vec {v}}){\bigr )}={\begin{pmatrix}-1\\1\\0\\\vdots \\0\end{pmatrix}}$

is in $V$ . That is, ${\vec {e}}_{2}-{\vec {e}}_{1}\in V$ , where ${\vec {e}}_{1}$ , ${\vec {e}}_{2}$ , ..., ${\vec {e}}_{n}$  is the standard basis for $\mathbb {R} ^{n}$ . Similarly, by placing $x$  and $y$  in other pairs of positions, ${\vec {e}}_{3}-{\vec {e}}_{1}$ , ..., ${\vec {e}}_{n}-{\vec {e}}_{1}$  are all in $V$ . It is easy to see that the vectors ${\vec {e}}_{2}-{\vec {e}}_{1}$ , ${\vec {e}}_{3}-{\vec {e}}_{1}$ , ..., ${\vec {e}}_{n}-{\vec {e}}_{1}$  are linearly independent (that is, form a linearly independent set), so $\dim V\geq n-1$ .

Finally, we can write

${\begin{array}{rl}{\vec {v}}&=x_{1}{\vec {e}}_{1}+x_{2}{\vec {e}}_{2}+\dots +x_{n}{\vec {e}}_{n}\\&=(x_{1}+x_{2}+\dots +x_{n}){\vec {e}}_{1}+x_{2}({\vec {e}}_{2}-{\vec {e}}_{1})+\dots +x_{n}({\vec {e}}_{n}-{\vec {e}}_{1})\end{array}}$

This shows that if $x_{1}+x_{2}+\dots +x_{n}=0$  then ${\vec {v}}$  is in the span of ${\vec {e}}_{2}-{\vec {e}}_{1}$ , ..., ${\vec {e}}_{n}-{\vec {e}}_{1}$  (that is, is in the span of the set of those vectors); similarly, each $\sigma ({\vec {v}})$  will be in this span, so $V$  will equal this span and $\dim V=n-1$ . On the other hand, if $x_{1}+x_{2}+\dots +x_{n}\neq 0$  then the above equation shows that ${\vec {e}}_{1}\in V$  and thus ${\vec {e}}_{1},\dots ,{\vec {e}}_{n}\in V$ , so $V=\mathbb {R} ^{n}$  and $\dim V=n$ .
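The four possibilities can be spot-checked numerically. A numpy sketch (added, with $n=4$ and arbitrarily chosen sample vectors for each case):

```python
import itertools
import numpy as np

def perm_span_dim(v):
    # dim of span{sigma(v)} = rank of the matrix whose rows are all
    # permutations of v's entries.
    rows = list(itertools.permutations(v))
    return np.linalg.matrix_rank(np.array(rows, dtype=float))

# The four cases for n = 4:
print(perm_span_dim([0, 0, 0, 0]))   # all entries equal, zero: 0
print(perm_span_dim([2, 2, 2, 2]))   # all entries equal, nonzero: 1
print(perm_span_dim([1, 2, 3, -6]))  # unequal entries, sum zero: n - 1 = 3
print(perm_span_dim([1, 2, 3, 4]))   # unequal entries, sum nonzero: n = 4
```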