# Linear Algebra/Sums and Scalar Products

 Linear Algebra ← Matrix Operations Sums and Scalar Products Matrix Multiplication →

Recall that for two maps ${\displaystyle f}$ and ${\displaystyle g}$ with the same domain and codomain, the map sum ${\displaystyle f+g}$ has this definition.

${\displaystyle {\vec {v}}\;{\stackrel {f+g}{\longmapsto }}\;f({\vec {v}})+g({\vec {v}})}$

The easiest way to see how the representations of the maps combine to represent the map sum is with an example.

Example 1.1

Suppose that ${\displaystyle f,g:\mathbb {R} ^{2}\to \mathbb {R} ^{3}}$ are represented with respect to the bases ${\displaystyle B}$ and ${\displaystyle D}$ by these matrices.

${\displaystyle F={\rm {Rep}}_{B,D}(f)={\begin{pmatrix}1&3\\2&0\\1&0\end{pmatrix}}_{B,D}\qquad G={\rm {Rep}}_{B,D}(g)={\begin{pmatrix}0&0\\-1&-2\\2&4\end{pmatrix}}_{B,D}}$

Then, for any ${\displaystyle {\vec {v}}\in \mathbb {R} ^{2}}$ represented with respect to ${\displaystyle B}$, computing the representation of ${\displaystyle f({\vec {v}})+g({\vec {v}})}$

${\displaystyle {\begin{pmatrix}1&3\\2&0\\1&0\end{pmatrix}}{\begin{pmatrix}v_{1}\\v_{2}\end{pmatrix}}+{\begin{pmatrix}0&0\\-1&-2\\2&4\end{pmatrix}}{\begin{pmatrix}v_{1}\\v_{2}\end{pmatrix}}={\begin{pmatrix}1v_{1}+3v_{2}\\2v_{1}+0v_{2}\\1v_{1}+0v_{2}\end{pmatrix}}+{\begin{pmatrix}0v_{1}+0v_{2}\\-1v_{1}-2v_{2}\\2v_{1}+4v_{2}\end{pmatrix}}}$

gives this representation of ${\displaystyle (f+g)({\vec {v}})}$.

${\displaystyle {\begin{pmatrix}(1+0)v_{1}+(3+0)v_{2}\\(2-1)v_{1}+(0-2)v_{2}\\(1+2)v_{1}+(0+4)v_{2}\end{pmatrix}}={\begin{pmatrix}1v_{1}+3v_{2}\\1v_{1}-2v_{2}\\3v_{1}+4v_{2}\end{pmatrix}}}$

Thus, the action of ${\displaystyle f+g}$ is described by this matrix-vector product.

${\displaystyle {\begin{pmatrix}1&3\\1&-2\\3&4\end{pmatrix}}_{B,D}{\begin{pmatrix}v_{1}\\v_{2}\end{pmatrix}}_{B}={\begin{pmatrix}1v_{1}+3v_{2}\\1v_{1}-2v_{2}\\3v_{1}+4v_{2}\end{pmatrix}}_{D}}$

This matrix is the entry-by-entry sum of the original matrices, e.g., the ${\displaystyle 1,1}$ entry of ${\displaystyle {\rm {Rep}}_{B,D}(f+g)}$ is the sum of the ${\displaystyle 1,1}$ entry of ${\displaystyle F}$ and the ${\displaystyle 1,1}$ entry of ${\displaystyle G}$.
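Example 1.1 can also be replayed numerically. The sketch below (plain Python, no libraries assumed) forms the entry-by-entry sum of ${\displaystyle F}$ and ${\displaystyle G}$ and checks that multiplying a vector by it agrees with summing the two individual matrix-vector products.

```python
# Matrices F and G from Example 1.1, stored as lists of rows.
F = [[1, 3], [2, 0], [1, 0]]
G = [[0, 0], [-1, -2], [2, 4]]

def mat_vec(M, v):
    """Matrix-vector product: each entry is a row of M dotted with v."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

# The entry-by-entry sum, as in the example.
S = [[f + g for f, g in zip(rf, rg)] for rf, rg in zip(F, G)]
print(S)  # [[1, 3], [1, -2], [3, 4]]

# Acting with S agrees with acting with F and G separately and adding.
v = [5, 7]
lhs = mat_vec(S, v)
rhs = [a + b for a, b in zip(mat_vec(F, v), mat_vec(G, v))]
print(lhs == rhs)  # True
```

The printed matrix is exactly the representation of ${\displaystyle f+g}$ computed above.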

Representing a scalar multiple of a map works the same way.

Example 1.2

If ${\displaystyle t}$ is a transformation represented by

${\displaystyle {\rm {Rep}}_{B,D}(t)={\begin{pmatrix}1&0\\1&1\end{pmatrix}}_{B,D}\quad {\text{so that}}\quad {\vec {v}}={\begin{pmatrix}v_{1}\\v_{2}\end{pmatrix}}_{B}\mapsto {\begin{pmatrix}v_{1}\\v_{1}+v_{2}\end{pmatrix}}_{D}=t({\vec {v}})}$

then the scalar multiple map ${\displaystyle 5t}$ acts in this way.

${\displaystyle {\vec {v}}={\begin{pmatrix}v_{1}\\v_{2}\end{pmatrix}}_{B}\;\longmapsto \;{\begin{pmatrix}5v_{1}\\5v_{1}+5v_{2}\end{pmatrix}}_{D}=5\cdot t({\vec {v}})}$

Therefore, this is the matrix representing ${\displaystyle 5t}$.

${\displaystyle {\rm {Rep}}_{B,D}(5t)={\begin{pmatrix}5&0\\5&5\end{pmatrix}}_{B,D}}$

Definition 1.3

The sum of two same-sized matrices is their entry-by-entry sum. The scalar multiple of a matrix is the result of entry-by-entry scalar multiplication.
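Definition 1.3 translates directly into code. This sketch (plain Python, with function names chosen for illustration) implements both operations and guards against mismatched sizes.

```python
def matrix_sum(A, B):
    """Entry-by-entry sum of two same-sized matrices."""
    assert len(A) == len(B) and all(len(ra) == len(rb) for ra, rb in zip(A, B)), \
        "matrices must be the same size"
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def scalar_multiple(r, A):
    """Entry-by-entry scalar multiplication."""
    return [[r * a for a in row] for row in A]

# The matrix representing 5t from Example 1.2:
print(scalar_multiple(5, [[1, 0], [1, 1]]))  # [[5, 0], [5, 5]]
```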

Remark 1.4

These extend the vector addition and scalar multiplication operations that we defined in the first chapter.

Theorem 1.5

Let ${\displaystyle h,g:V\to W}$ be linear maps represented with respect to bases ${\displaystyle B,D}$ by the matrices ${\displaystyle H}$ and ${\displaystyle G}$, and let ${\displaystyle r}$ be a scalar. Then the map ${\displaystyle h+g:V\to W}$ is represented with respect to ${\displaystyle B,D}$ by ${\displaystyle H+G}$, and the map ${\displaystyle r\cdot h:V\to W}$ is represented with respect to ${\displaystyle B,D}$ by ${\displaystyle rH}$.

Proof

This is Problem 2; generalize the examples above.

A notable special case of scalar multiplication is multiplication by zero: for any map ${\displaystyle h}$, the scalar multiple ${\displaystyle 0\cdot h}$ is the zero homomorphism, and for any matrix ${\displaystyle H}$, the scalar multiple ${\displaystyle 0\cdot H}$ is the zero matrix.

Example 1.6

The zero map from any three-dimensional space to any two-dimensional space is represented by the ${\displaystyle 2\!\times \!3}$ zero matrix

${\displaystyle Z={\begin{pmatrix}0&0&0\\0&0&0\end{pmatrix}}}$

no matter which domain and codomain bases are used.

## Exercises

This exercise is recommended for all readers.
Problem 1

Perform the indicated operations, if defined.

1. ${\displaystyle {\begin{pmatrix}5&-1&2\\6&1&1\end{pmatrix}}+{\begin{pmatrix}2&1&4\\3&0&5\end{pmatrix}}}$
2. ${\displaystyle 6\cdot {\begin{pmatrix}2&-1&-1\\1&2&3\end{pmatrix}}}$
3. ${\displaystyle {\begin{pmatrix}2&1\\0&3\end{pmatrix}}+{\begin{pmatrix}2&1\\0&3\end{pmatrix}}}$
4. ${\displaystyle 4{\begin{pmatrix}1&2\\3&-1\end{pmatrix}}+5{\begin{pmatrix}-1&4\\-2&1\end{pmatrix}}}$
5. ${\displaystyle 3{\begin{pmatrix}2&1\\3&0\end{pmatrix}}+2{\begin{pmatrix}1&1&4\\3&0&5\end{pmatrix}}}$
Problem 2

Prove Theorem 1.5.

1. Prove that matrix addition represents addition of linear maps.
2. Prove that matrix scalar multiplication represents scalar multiplication of linear maps.
This exercise is recommended for all readers.
Problem 3

Prove each, where the operations are defined, where ${\displaystyle G}$, ${\displaystyle H}$, and ${\displaystyle J}$ are matrices, where ${\displaystyle Z}$ is the zero matrix, and where ${\displaystyle r}$ and ${\displaystyle s}$ are scalars.

1. Matrix addition is commutative ${\displaystyle G+H=H+G}$ .
2. Matrix addition is associative ${\displaystyle G+(H+J)=(G+H)+J}$ .
3. The zero matrix is an additive identity ${\displaystyle G+Z=G}$ .
4. ${\displaystyle 0\cdot G=Z}$
5. ${\displaystyle (r+s)G=rG+sG}$
6. Matrices have an additive inverse ${\displaystyle G+(-1)\cdot G=Z}$ .
7. ${\displaystyle r(G+H)=rG+rH}$
8. ${\displaystyle (rs)G=r(sG)}$
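Each of these identities is proved by comparing entries; before writing the proofs, it can help to spot-check a few of them numerically. A sketch in plain Python, with arbitrarily chosen matrices:

```python
def add(A, B):
    """Entry-by-entry matrix sum."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def smul(r, A):
    """Entry-by-entry scalar multiple."""
    return [[r * a for a in row] for row in A]

G = [[2, -1], [0, 3]]
H = [[1, 4], [-2, 5]]
Z = [[0, 0], [0, 0]]
r, s = 3, -2

print(add(G, H) == add(H, G))                         # commutativity
print(add(G, Z) == G)                                 # additive identity
print(smul(r + s, G) == add(smul(r, G), smul(s, G)))  # (r+s)G = rG + sG
print(add(G, smul(-1, G)) == Z)                       # additive inverse
```

Each check prints True; the proofs themselves reduce to the corresponding facts about real-number arithmetic, applied entry by entry.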
Problem 4

Fix domain and codomain spaces. In general, one matrix can represent many different maps with respect to different bases. However, prove that a zero matrix represents only a zero map. Are there other such matrices?

This exercise is recommended for all readers.
Problem 5

Let ${\displaystyle V}$  and ${\displaystyle W}$  be vector spaces of dimensions ${\displaystyle n}$  and ${\displaystyle m}$ . Show that the space ${\displaystyle \mathop {\mathcal {L}} (V,W)}$  of linear maps from ${\displaystyle V}$  to ${\displaystyle W}$  is isomorphic to ${\displaystyle {\mathcal {M}}_{m\!\times \!n}}$ .

This exercise is recommended for all readers.
Problem 6

Show that it follows from the prior questions that for any six transformations ${\displaystyle t_{1},\dots ,t_{6}:\mathbb {R} ^{2}\to \mathbb {R} ^{2}}$  there are scalars ${\displaystyle c_{1},\dots ,c_{6}\in \mathbb {R} }$  such that ${\displaystyle c_{1}t_{1}+\dots +c_{6}t_{6}}$  is the zero map. (Hint: this is a bit of a misleading question.)

Problem 7

The trace of a square matrix is the sum of the entries on the main diagonal (the ${\displaystyle 1,1}$  entry plus the ${\displaystyle 2,2}$  entry, etc.; we will see the significance of the trace in Chapter Five). Show that ${\displaystyle {\mbox{trace}}(H+G)={\mbox{trace}}(H)+{\mbox{trace}}(G)}$ . Is there a similar result for scalar multiplication?
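A quick numerical check of the claimed identity, along with its scalar analogue ${\displaystyle {\mbox{trace}}(r\cdot H)=r\cdot {\mbox{trace}}(H)}$, in plain Python (a sketch, not the requested proof):

```python
def trace(M):
    """Sum of the main-diagonal entries of a square matrix."""
    return sum(M[i][i] for i in range(len(M)))

H = [[1, 2], [3, 4]]
G = [[5, -1], [0, 2]]

# trace(H + G) = trace(H) + trace(G)
HplusG = [[h + g for h, g in zip(rh, rg)] for rh, rg in zip(H, G)]
print(trace(HplusG) == trace(H) + trace(G))  # True

# trace(r * H) = r * trace(H)
r = 7
rH = [[r * h for h in row] for row in H]
print(trace(rH) == r * trace(H))  # True
```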

Problem 8

Recall that the transpose of a matrix ${\displaystyle M}$  is another matrix, whose ${\displaystyle i,j}$  entry is the ${\displaystyle j,i}$  entry of ${\displaystyle M}$ . Verify these identities.

1. ${\displaystyle {{(G+H)}^{\rm {trans}}}={{G}^{\rm {trans}}}+{{H}^{\rm {trans}}}}$
2. ${\displaystyle {{(r\cdot H)}^{\rm {trans}}}=r\cdot {{H}^{\rm {trans}}}}$
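Both identities follow from comparing ${\displaystyle i,j}$ entries. A numerical spot-check (plain Python sketch, matrices chosen arbitrarily):

```python
def transpose(M):
    """The i,j entry of the result is the j,i entry of M."""
    return [list(col) for col in zip(*M)]

def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def smul(r, A):
    return [[r * a for a in row] for row in A]

G = [[1, 2, 3], [4, 5, 6]]
H = [[0, -1, 2], [3, 0, 1]]
r = 4

print(transpose(add(G, H)) == add(transpose(G), transpose(H)))  # True
print(transpose(smul(r, H)) == smul(r, transpose(H)))           # True
```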
This exercise is recommended for all readers.
Problem 9

A square matrix is symmetric if each ${\displaystyle i,j}$  entry equals the ${\displaystyle j,i}$  entry, that is, if the matrix equals its transpose.

1. Prove that for any ${\displaystyle H}$ , the matrix ${\displaystyle H+{{H}^{\rm {trans}}}}$  is symmetric. Does every symmetric matrix have this form?
2. Prove that the set of ${\displaystyle n\!\times \!n}$  symmetric matrices is a subspace of ${\displaystyle {\mathcal {M}}_{n\!\times \!n}}$ .
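For part 1, note that the ${\displaystyle i,j}$ entry of ${\displaystyle H+{{H}^{\rm {trans}}}}$ is ${\displaystyle h_{i,j}+h_{j,i}}$, which is unchanged when ${\displaystyle i}$ and ${\displaystyle j}$ are swapped. A numerical illustration (plain Python sketch):

```python
def transpose(M):
    return [list(col) for col in zip(*M)]

def is_symmetric(M):
    """A square matrix is symmetric when it equals its transpose."""
    return M == transpose(M)

H = [[1, 7], [2, 5]]  # not itself symmetric
HplusHt = [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(H, transpose(H))]
print(HplusHt)                # [[2, 9], [9, 10]]
print(is_symmetric(HplusHt))  # True
```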
This exercise is recommended for all readers.
Problem 10
1. How does matrix rank interact with scalar multiplication: can a scalar product of a rank ${\displaystyle n}$  matrix have rank less than ${\displaystyle n}$ ? Greater?
2. How does matrix rank interact with matrix addition: can a sum of rank ${\displaystyle n}$  matrices have rank less than ${\displaystyle n}$ ? Greater?