# Linear Algebra/OLD/Matrix Operations

## Zero Matrix

A zero matrix is a matrix with all its entries being zero. An example of a zero matrix is

${\displaystyle M={\begin{bmatrix}0&0&0\\0&0&0\\0&0&0\end{bmatrix}}}$
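The same zero matrix can be built numerically; here is a minimal sketch using NumPy (an assumed dependency, not part of this article):

```python
import numpy as np

# np.zeros fills every entry of the 3x3 matrix with 0.
M = np.zeros((3, 3))
print(M)
```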

## Scalars

A scalar is a constant used to scale a matrix; any real number can serve as a scalar.

If r is a scalar and A is a matrix, then the scalar multiple rA is the matrix whose columns are r times the corresponding columns of A.

Here is an example,

${\displaystyle r=4\qquad A={\begin{bmatrix}2&&3\\7&&1\end{bmatrix}}}$
${\displaystyle rA={\begin{bmatrix}4*2&&4*3\\4*7&&4*1\end{bmatrix}}={\begin{bmatrix}8&&12\\28&&4\end{bmatrix}}}$
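This scalar multiplication can be checked with a short NumPy sketch (NumPy is assumed here, not part of the original text):

```python
import numpy as np

r = 4
A = np.array([[2, 3],
              [7, 1]])
# Every entry (and hence every column) of A is multiplied by r.
rA = r * A
print(rA)
```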

A scalar is used implicitly when we subtract two matrices, because -B can be defined as (-1)B. This means that when we subtract the matrix B from the matrix A, A-B is the same as A+(-1)B.

Two matrices can only be added or subtracted if they have the same size. Matrix addition and subtraction are done entry-wise, which means that each entry in A+B is the sum of the corresponding entries in A and B.

Here is an example of matrix addition

${\displaystyle A={\begin{bmatrix}7&&5&&3\\4&&0&&5\end{bmatrix}}\qquad B={\begin{bmatrix}1&&1&&1\\-1&&3&&2\end{bmatrix}}}$
${\displaystyle A+B={\begin{bmatrix}7+1&&5+1&&3+1\\4-1&&0+3&&5+2\end{bmatrix}}={\begin{bmatrix}8&&6&&4\\3&&3&&7\end{bmatrix}}}$

And an example of subtraction

${\displaystyle A={\begin{bmatrix}7&&5&&3\\4&&0&&5\end{bmatrix}}\qquad B={\begin{bmatrix}1&&1&&1\\-1&&3&&2\end{bmatrix}}}$
${\displaystyle A-B={\begin{bmatrix}7-1&&5-1&&3-1\\4+1&&0-3&&5-2\end{bmatrix}}={\begin{bmatrix}6&&4&&2\\5&&-3&&4\end{bmatrix}}}$

Remember, you cannot add or subtract two matrices of different sizes.

The following rules apply to sums and scalar multiples of matrices.
Let A, B, and C be matrices of the same size, and let r and s be scalars.

• A + B = B + A
• (A + B) + C = A + (B + C)
• A + 0 = A
• r(A + B) = rA + rB
• (r + s)A = rA + sA
• r(sA) = (rs)A
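Because all six rules hold entry-wise, they can be confirmed numerically. A sketch with NumPy (assumed here; the random matrices are illustrative, not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = (rng.integers(-5, 6, size=(2, 3)) for _ in range(3))
r, s = 4, -2

assert np.array_equal(A + B, B + A)                  # A + B = B + A
assert np.array_equal((A + B) + C, A + (B + C))      # associativity
assert np.array_equal(A + np.zeros((2, 3), int), A)  # A + 0 = A
assert np.array_equal(r * (A + B), r * A + r * B)    # r(A + B) = rA + rB
assert np.array_equal((r + s) * A, r * A + s * A)    # (r + s)A = rA + sA
assert np.array_equal(r * (s * A), (r * s) * A)      # r(sA) = (rs)A
print("all six rules hold")
```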

## Matrix Multiplication

Matrix multiplication is slightly less intuitive for the beginning student of linear algebra than is scalar multiplication. It is, however, no more difficult.

Definition

If A is a 1 by m matrix and B is an m by 1 matrix, then the product AB is given as

${\displaystyle {\begin{bmatrix}a_{11}&a_{12}&\ldots &a_{1m}\\\end{bmatrix}}{\begin{bmatrix}b_{11}\\b_{21}\\\vdots \\b_{m1}\end{bmatrix}}={\begin{bmatrix}a_{11}b_{11}+a_{12}b_{21}+\cdots +a_{1m}b_{m1}\\\end{bmatrix}}}$

This summation of terms can be written in sigma notation as

${\displaystyle {\begin{bmatrix}\sum _{k=1}^{m}a_{1k}b_{k1}\\\end{bmatrix}}_{(1,1)}}$
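This row-times-column product can be sketched in NumPy (an assumed dependency; the numbers are illustrative):

```python
import numpy as np

a = np.array([[4, 2, 1]])      # 1 x 3 row matrix
b = np.array([[3],
              [5],
              [7]])            # 3 x 1 column matrix
# The product is the 1 x 1 matrix [a11*b11 + a12*b21 + a13*b31].
print(a @ b)  # [[29]]
```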

We can use this knowledge to determine if matrix multiplication can occur. For example, a ${\displaystyle 3\times 2}$  multiplied by a ${\displaystyle 2\times 3}$  matrix will yield a ${\displaystyle 3\times 3}$  matrix. If the number of columns of the first matrix is equal to the number of rows of the second matrix, multiplication can occur (as seen in the example mentioned).

Matrix multiplication is noncommutative, meaning that in general AB does not equal BA. This is easy to see from the example above: reversing the factors yields a ${\displaystyle 2\times 2}$ matrix rather than a ${\displaystyle 3\times 3}$ one.
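The shape rule and the noncommutativity can both be seen in a short NumPy sketch (NumPy is assumed, not part of the article):

```python
import numpy as np

A = np.ones((3, 2))  # 3 x 2
B = np.ones((2, 3))  # 2 x 3
# Inner dimensions match in both orders, so both products exist,
# but they do not even have the same shape.
print((A @ B).shape)  # (3, 3)
print((B @ A).shape)  # (2, 2)
```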

## Powers

If A is an ${\displaystyle n\times n}$  matrix and if k is a positive integer, then ${\displaystyle A^{k}}$  denotes the product of k copies of A

${\displaystyle A^{k}={\begin{matrix}\underbrace {A\cdots A} \\k\end{matrix}}}$

If A is nonzero and x is in ${\displaystyle \mathbb {R} ^{n}}$, then ${\displaystyle A^{k}\mathbf {x} }$  is the result of left-multiplying x by A repeatedly, k times. If k = 0, then ${\displaystyle A^{0}\mathbf {x} }$  should be x itself, so ${\displaystyle A^{0}}$  is interpreted as the identity matrix.
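Matrix powers can be computed with NumPy's `matrix_power` (NumPy is an assumption here; the example matrix is illustrative):

```python
import numpy as np

A = np.array([[1, 1],
              [0, 1]])
# A^3 is the product of three copies of A.
print(np.linalg.matrix_power(A, 3))
# k = 0 gives the identity matrix, matching the convention above.
print(np.linalg.matrix_power(A, 0))
```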

## Transpose

Given the ${\displaystyle m\times n}$  matrix A, the transpose of A is the ${\displaystyle n\times m}$  matrix, denoted ${\displaystyle A^{T}}$ , whose columns are formed from the corresponding rows of A.

For example

${\displaystyle A={\begin{bmatrix}a&&b\\c&&d\end{bmatrix}}\qquad B={\begin{bmatrix}3&&5\\2&&7\\6&&9\\1&&0\\5&&2\end{bmatrix}}}$
${\displaystyle A^{T}={\begin{bmatrix}a&&c\\b&&d\end{bmatrix}}\qquad B^{T}={\begin{bmatrix}3&&2&&6&&1&&5\\5&&7&&9&&0&&2\end{bmatrix}}}$

The following rules apply when working with transposes.

1. ${\displaystyle \left(A^{T}\right)^{T}=A\,}$
2. ${\displaystyle (A+B)^{T}=A^{T}+B^{T}\,}$
3. For any scalar r, ${\displaystyle (rA)^{T}=rA^{T}\,}$
4. ${\displaystyle (AB)^{T}=B^{T}A^{T}\,}$

The 4th rule can be generalized to products of more than two factors: the transpose of a product of matrices equals the product of their transposes in the reverse order. That is,

${\displaystyle (A_{1}A_{2}A_{3}\cdots A_{n})^{T}=A_{n}^{T}\cdots A_{3}^{T}A_{2}^{T}A_{1}^{T}}$
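The reverse-order rule can be verified numerically. A sketch with NumPy (assumed dependency; the random matrices are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.integers(0, 5, size=(2, 3))
B = rng.integers(0, 5, size=(3, 4))
C = rng.integers(0, 5, size=(4, 2))

# The transpose of a product is the product of the transposes, reversed.
assert np.array_equal((A @ B).T, B.T @ A.T)
assert np.array_equal((A @ B @ C).T, C.T @ B.T @ A.T)
print("reverse-order rule verified")
```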