Prove that matrix addition represents addition of linear maps.
Prove that matrix scalar multiplication represents scalar
multiplication of linear maps.
Answer
Represent the domain vector $\vec{v}\in V$ and the maps $g,h\colon V\to W$ with respect to bases $B,D$ in the usual way.
The representation of $(g+h)\,(\vec{v})=g(\vec{v})+h(\vec{v})$ regroups to the entry-by-entry sum of the representation of $g(\vec{v})$ and the representation of $h(\vec{v})$.
The representation of $(r\cdot h)\,(\vec{v})=r\cdot h(\vec{v})$ is the entry-by-entry multiple of $r$ and the representation of $h(\vec{v})$.
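In more detail, here is a sketch of the regrouping for the first part, under notation supplied for this illustration: $B=\langle\vec{\beta}_1,\dots,\vec{\beta}_n\rangle$, $D=\langle\vec{\delta}_1,\dots,\vec{\delta}_m\rangle$, with $g_{i,j}$ and $h_{i,j}$ the entries of $\operatorname{Rep}_{B,D}(g)$ and $\operatorname{Rep}_{B,D}(h)$.
$$(g+h)(\vec{\beta}_j)=g(\vec{\beta}_j)+h(\vec{\beta}_j)=(g_{1,j}+h_{1,j})\,\vec{\delta}_1+\dots+(g_{m,j}+h_{m,j})\,\vec{\delta}_m$$
So the $j$-th column of $\operatorname{Rep}_{B,D}(g+h)$ is the entry-by-entry sum of the $j$-th columns of $\operatorname{Rep}_{B,D}(g)$ and $\operatorname{Rep}_{B,D}(h)$.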
This exercise is recommended for all readers.
Problem 3
Prove each, where the operations are defined, where $G$, $H$, and $J$ are matrices, where $Z$ is the zero matrix, and where $r$ and $s$ are scalars.
1. Matrix addition is commutative $G+H=H+G$.
2. Matrix addition is associative $G+(H+J)=(G+H)+J$.
3. The zero matrix is an additive identity $G+Z=G$.
4. $0\cdot G=Z$
5. $(r+s)G=rG+sG$
6. Matrices have an additive inverse $G+(-1)\cdot G=Z$.
7. $r(G+H)=rG+rH$
8. $(rs)G=r(sG)$
Answer
First, each of these properties
is easy to check in an entry-by-entry way.
For example, writing
$$G=\begin{pmatrix} g_{1,1} &\cdots &g_{1,n} \\ \vdots & &\vdots \\ g_{m,1} &\cdots &g_{m,n} \end{pmatrix}
\qquad
H=\begin{pmatrix} h_{1,1} &\cdots &h_{1,n} \\ \vdots & &\vdots \\ h_{m,1} &\cdots &h_{m,n} \end{pmatrix}$$
then, by definition we have
$$G+H=\begin{pmatrix} g_{1,1}+h_{1,1} &\cdots &g_{1,n}+h_{1,n} \\ \vdots & &\vdots \\ g_{m,1}+h_{m,1} &\cdots &g_{m,n}+h_{m,n} \end{pmatrix}
\qquad
H+G=\begin{pmatrix} h_{1,1}+g_{1,1} &\cdots &h_{1,n}+g_{1,n} \\ \vdots & &\vdots \\ h_{m,1}+g_{m,1} &\cdots &h_{m,n}+g_{m,n} \end{pmatrix}$$
and the two are equal since their entries are equal, $g_{i,j}+h_{i,j}=h_{i,j}+g_{i,j}$, because addition of numbers is commutative.
That is, each of these is easy to check by using
Definition 1.3 alone.
However, each property
is also easy to understand in terms of the represented
maps, by applying Theorem 1.5 as well as
the definition.
1. The two maps $g+h$ and $h+g$ (writing $g$ and $h$ for the maps represented by $G$ and $H$) are equal because $g(\vec{v})+h(\vec{v})=h(\vec{v})+g(\vec{v})$ for every $\vec{v}$, as addition is commutative in any vector space. Because the maps are the same, they must have the same representative.
2. As with the prior answer, except that here we apply that vector space addition is associative.
3. As before, except that here we note that $g(\vec{v})+z(\vec{v})=g(\vec{v})+\vec{0}=g(\vec{v})$, where $z$ is the zero map.
4. Apply that $0\cdot g(\vec{v})=\vec{0}=z(\vec{v})$.
5. Apply that $(r+s)\cdot g(\vec{v})=r\cdot g(\vec{v})+s\cdot g(\vec{v})$.
6. Apply the prior two items with $r=1$ and $s=-1$.
7. Apply that $r\cdot\bigl(g(\vec{v})+h(\vec{v})\bigr)=r\cdot g(\vec{v})+r\cdot h(\vec{v})$.
8. Apply that $(rs)\cdot g(\vec{v})=r\cdot\bigl(s\cdot g(\vec{v})\bigr)$.
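As a sketch of how item 6 follows from the prior two items, the matrix-level chain of equalities can be spelled out as follows; it also uses that $1\cdot G=G$.
$$G+(-1)\cdot G=1\cdot G+(-1)\cdot G=\bigl(1+(-1)\bigr)\cdot G=0\cdot G=Z$$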
Problem 4
Fix domain and codomain spaces.
In general, one
matrix can represent many different maps with respect to different bases.
However, prove that a zero matrix represents only a zero map.
Are there other such matrices?
Answer
For any $V,W$ with bases $B=\langle\vec{\beta}_1,\dots,\vec{\beta}_n\rangle$ and $D=\langle\vec{\delta}_1,\dots,\vec{\delta}_m\rangle$, the (appropriately-sized) zero matrix represents this map.
$$\vec{\beta}_1\mapsto 0\cdot\vec{\delta}_1+\dots+0\cdot\vec{\delta}_m
\quad\cdots\quad
\vec{\beta}_n\mapsto 0\cdot\vec{\delta}_1+\dots+0\cdot\vec{\delta}_m$$
This is the zero map.
There are no other matrices that represent only one map.
For, suppose that $H$ is not the zero matrix.
Then it has a nonzero entry; assume that $h_{i,j}\neq 0$.
With respect to bases $B,D$, it represents $h_1\colon V\to W$ sending
$$\vec{\beta}_j\mapsto h_{1,j}\vec{\delta}_1+\dots+h_{i,j}\vec{\delta}_i+\dots+h_{m,j}\vec{\delta}_m$$
and with respect to $B,\,2\cdot D$ it also represents $h_2\colon V\to W$ sending
$$\vec{\beta}_j\mapsto h_{1,j}\cdot(2\vec{\delta}_1)+\dots+h_{i,j}\cdot(2\vec{\delta}_i)+\dots+h_{m,j}\cdot(2\vec{\delta}_m)$$
(the notation $2\cdot D$ means to double all of the members of $D$). These maps are easily seen to be unequal.
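A minimal concrete instance of the second paragraph, with the numbers chosen here only for illustration: take $V=W=\mathbb{R}^1$ and $B=D=\langle 1\rangle$. With respect to $B,D$ the $1\times 1$ matrix $(2)$ represents $x\mapsto 2x$, while with respect to $B,\,2\cdot D=\langle 2\rangle$ it represents $x\mapsto 4x$, since
$$x\;\longmapsto\;\operatorname{Rep}_B(x)=(x)\;\longmapsto\;(2)(x)=(2x)
\qquad\text{and}\qquad
(2x)\ \text{represents}\ 2x\cdot 2=4x\ \text{with respect to}\ \langle 2\rangle.$$
These two maps differ, so the matrix $(2)$ does not pin down a single map.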
This exercise is recommended for all readers.
Problem 5
Let $V$ and $W$ be vector spaces of dimensions $n$ and $m$.
Show that the space $\mathop{\mathcal{L}}(V,W)$ of linear maps from $V$ to $W$ is isomorphic to $\mathcal{M}_{m\times n}$.
Answer
Fix bases $B$ and $D$ for $V$ and $W$, and consider associating each linear map with the matrix representing that map, $h\mapsto\operatorname{Rep}_{B,D}(h)$. From the prior section we know that (under fixed bases) the matrices correspond to linear maps, so the representation map is one-to-one and onto. That it preserves linear operations is Theorem 1.5.
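One consequence worth noting, added here as an illustration of the isomorphism rather than as part of the original answer: since isomorphic spaces have equal dimensions,
$$\dim\mathop{\mathcal{L}}(V,W)=\dim\mathcal{M}_{m\times n}=mn$$
so, for instance, the transformations of $\mathbb{R}^2$ form a four-dimensional space.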
This exercise is recommended for all readers.
Problem 6
Show that it follows from the prior questions that
for any six transformations $t_1,\dots,t_6\colon\mathbb{R}^2\to\mathbb{R}^2$
there are scalars $c_1,\dots,c_6\in\mathbb{R}$ such that
$c_1t_1+\dots+c_6t_6$ is the zero map.
(Hint: this is a bit of a misleading question.)
Answer
Fix bases and represent the transformations with $2\times 2$ matrices. The space $\mathcal{M}_{2\times 2}$ of such matrices has dimension four, and hence the above six-element set is linearly dependent. By the prior exercise that dependence extends to a dependence of maps. (The misleading part is only that there are six transformations, not five, so that we have more than we need to give the existence of the dependence.)
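Spelled out as a sketch, with the coefficient names supplied here: any six members of the four-dimensional space $\mathcal{M}_{2\times 2}$ satisfy a nontrivial relationship
$$c_1\operatorname{Rep}_{B,D}(t_1)+\dots+c_6\operatorname{Rep}_{B,D}(t_6)=Z$$
with not all of the $c_i$ equal to zero (here $B,D$ are the fixed bases), and because the representation map is an isomorphism this gives that $c_1t_1+\dots+c_6t_6$ is the zero map.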
Problem 7
The trace of a square matrix is the sum of the entries on the
main diagonal (the $1,1$ entry
plus the $2,2$ entry, etc.;
we will see the significance of the trace in Chapter Five).
Show that $\operatorname{trace}(H+G)=\operatorname{trace}(H)+\operatorname{trace}(G)$.
Is there a similar result for scalar multiplication?
Answer
That the trace of a sum is the sum of the traces holds because both $\operatorname{trace}(H+G)$ and $\operatorname{trace}(H)+\operatorname{trace}(G)$ are the sum of $h_{1,1}+g_{1,1}$ with $h_{2,2}+g_{2,2}$, etc. For scalar multiplication we have $\operatorname{trace}(r\cdot H)=r\cdot\operatorname{trace}(H)$; the proof is easy. Thus the trace map is a homomorphism from $\mathcal{M}_{n\times n}$ to $\mathbb{R}$.
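A quick numerical check, with the matrices chosen here for illustration:
$$\operatorname{trace}\!\left(\begin{pmatrix}1&2\\3&4\end{pmatrix}+\begin{pmatrix}5&6\\7&8\end{pmatrix}\right)
=\operatorname{trace}\begin{pmatrix}6&8\\10&12\end{pmatrix}
=18
=5+13
=\operatorname{trace}\begin{pmatrix}1&2\\3&4\end{pmatrix}+\operatorname{trace}\begin{pmatrix}5&6\\7&8\end{pmatrix}$$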
Problem 8
Recall that the transpose
of a matrix $M$ is another matrix, whose $i,j$ entry is the
$j,i$ entry of $M$.
Verify these identities.
1. $(G+H)^{\rm trans}=G^{\rm trans}+H^{\rm trans}$
2. $(r\cdot H)^{\rm trans}=r\cdot H^{\rm trans}$
Answer
1. The $i,j$ entry of $(G+H)^{\rm trans}$ is $g_{j,i}+h_{j,i}$. That is also the $i,j$ entry of $G^{\rm trans}+H^{\rm trans}$.
2. The $i,j$ entry of $(r\cdot H)^{\rm trans}$ is $r\,h_{j,i}$, which is also the $i,j$ entry of $r\cdot H^{\rm trans}$.
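A quick check of the first identity, with the matrices chosen here for illustration:
$$\left(\begin{pmatrix}1&2\\3&4\end{pmatrix}+\begin{pmatrix}5&6\\7&8\end{pmatrix}\right)^{\!\rm trans}
=\begin{pmatrix}6&8\\10&12\end{pmatrix}^{\!\rm trans}
=\begin{pmatrix}6&10\\8&12\end{pmatrix}
=\begin{pmatrix}1&3\\2&4\end{pmatrix}+\begin{pmatrix}5&7\\6&8\end{pmatrix}$$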
This exercise is recommended for all readers.
Problem 9
A square matrix is symmetric if each $i,j$ entry equals
the $j,i$ entry, that is, if the matrix equals its transpose.
1. Prove that for any square $H$, the matrix $H+H^{\rm trans}$ is symmetric. Does every symmetric matrix have this form?
2. Prove that the set of symmetric matrices is a subspace of $\mathcal{M}_{n\times n}$.
Answer
1. For $H+H^{\rm trans}$, the $i,j$ entry is $h_{i,j}+h_{j,i}$ and the $j,i$ entry is $h_{j,i}+h_{i,j}$. The two are equal and thus $H+H^{\rm trans}$ is symmetric. Every symmetric matrix does have that form, since a symmetric $H$ can be written as $H=(1/2)\cdot H+\bigl((1/2)\cdot H\bigr)^{\rm trans}$.
2. The set of symmetric matrices is nonempty as it contains the zero matrix. Clearly a scalar multiple of a symmetric matrix is symmetric. A sum of two symmetric matrices is symmetric because $h_{i,j}+g_{i,j}=h_{j,i}+g_{j,i}$ (since $h_{i,j}=h_{j,i}$ and $g_{i,j}=g_{j,i}$). Thus the subset is nonempty and closed under the inherited operations, and so it is a subspace.
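As a concrete illustration of the first item, with the matrix chosen here only for the example:
$$\begin{pmatrix}1&2\\3&4\end{pmatrix}+\begin{pmatrix}1&2\\3&4\end{pmatrix}^{\!\rm trans}
=\begin{pmatrix}1&2\\3&4\end{pmatrix}+\begin{pmatrix}1&3\\2&4\end{pmatrix}
=\begin{pmatrix}2&5\\5&8\end{pmatrix}$$
which equals its own transpose.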
This exercise is recommended for all readers.
Problem 10
How does matrix rank interact with
scalar multiplication: can
a scalar product of a rank $n$ matrix have rank less than $n$?
Greater?
How does matrix rank interact with matrix
addition: can a sum of
rank $n$ matrices have rank less than $n$?
Greater?
Answer
Scalar multiplication leaves the rank of a matrix unchanged
except that multiplication by zero leaves the matrix
with rank zero.
(This follows from the first theorem of the book, that multiplying a
row by a nonzero
scalar doesn't change the solution set of the associated
linear system.)
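For instance, with a rank one matrix $A$ chosen here for illustration:
$$A=\begin{pmatrix}1&2\\2&4\end{pmatrix},\qquad
3\cdot A=\begin{pmatrix}3&6\\6&12\end{pmatrix}\ \text{(still rank one)},\qquad
0\cdot A=\begin{pmatrix}0&0\\0&0\end{pmatrix}\ \text{(rank zero)}.$$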
A sum of rank $n$ matrices can have rank less than $n$.
For instance, for any matrix $A$, the sum $A+(-1)\cdot A$ has rank zero.
A sum of rank $n$ matrices can have rank greater than $n$.
Here are rank one matrices that sum to a rank two matrix.
$$\begin{pmatrix}1&0\\0&0\end{pmatrix}+\begin{pmatrix}0&0\\0&1\end{pmatrix}=\begin{pmatrix}1&0\\0&1\end{pmatrix}$$