# Engineering Analysis/Matrices

## Derivatives

Consider the following set of linear equations:

${\displaystyle a=bx_{1}+cx_{2}}$
${\displaystyle d=ex_{1}+fx_{2}}$

We can define the matrix A to hold the coefficients, the vector B to hold the constant results, and the vector x to hold the variables:

${\displaystyle A={\begin{bmatrix}b&c\\e&f\end{bmatrix}}}$
${\displaystyle B={\begin{bmatrix}a\\d\end{bmatrix}}}$
${\displaystyle x={\begin{bmatrix}x_{1}\\x_{2}\end{bmatrix}}}$

And rewriting the equation in terms of the matrices, we get:

${\displaystyle B=Ax}$
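As a quick sanity check, the matrix form and the two scalar equations produce the same numbers. The values below (b = 2, c = 3, e = 1, f = 4, x₁ = 5, x₂ = 6) are made up for illustration; this is a small NumPy sketch:

```python
import numpy as np

# Hypothetical coefficients: b=2, c=3, e=1, f=4, and variables x1=5, x2=6.
A = np.array([[2.0, 3.0],
              [1.0, 4.0]])
x = np.array([5.0, 6.0])

# Matrix form: B = Ax
B = A @ x

# Scalar form: a = b*x1 + c*x2 and d = e*x1 + f*x2
a = 2.0 * 5.0 + 3.0 * 6.0
d = 1.0 * 5.0 + 4.0 * 6.0
# B equals the vector [a, d], so B = Ax is shorthand for the original system.
```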

Now, let's say we want the derivative of this equation with respect to the vector x:

${\displaystyle {\frac {d}{dx}}B={\frac {d}{dx}}Ax}$

The left-hand side, B, is a vector of constants, so its derivative with respect to x is zero. Differentiating the right-hand side gives the standard result:

${\displaystyle {\frac {d}{dx}}Ax=A}$
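For a linear map ${\displaystyle f(x)=Ax}$, the Jacobian with respect to x is simply A. This can be checked numerically with a central finite difference; the sketch below uses NumPy, and the matrix entries are made up for illustration:

```python
import numpy as np

# Central finite-difference Jacobian of f at x0 (step size h is a tuning choice).
def jacobian(f, x0, h=1e-6):
    m = f(x0).size
    n = x0.size
    J = np.zeros((m, n))
    for j in range(n):
        step = np.zeros(n)
        step[j] = h
        J[:, j] = (f(x0 + step) - f(x0 - step)) / (2.0 * h)
    return J

# Made-up matrix; any evaluation point works because the map is linear.
A = np.array([[2.0, 3.0],
              [1.0, 4.0]])
f = lambda x: A @ x
J = jacobian(f, np.array([5.0, 6.0]))
# J agrees with A to within finite-difference error.
```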

## Pseudo-Inverses

There are special matrices known as pseudo-inverses, which satisfy some of the properties of an inverse, but not others. To recap: if we have two square n × n matrices A and B, and the following equation holds, we say that A is the inverse of B, and B is the inverse of A:

${\displaystyle AB=BA=I}$
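For square invertible matrices, both products give the identity. A minimal NumPy check, with a made-up invertible matrix:

```python
import numpy as np

# A made-up invertible 2x2 matrix and its computed inverse.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.linalg.inv(A)

# Both products are the 2x2 identity, so A and B are inverses of each other.
AB = A @ B
BA = B @ A
```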

### Right Pseudo-Inverse

Consider the following matrix:

${\displaystyle R=A^{T}[AA^{T}]^{-1}}$

This matrix exists whenever ${\displaystyle AA^{T}}$ is invertible, which requires A to have full row rank. We call this matrix R the right pseudo-inverse of A, because:

${\displaystyle AR=I}$

but

${\displaystyle RA\neq I}$

We will denote the right pseudo-inverse of A as ${\displaystyle A^{\dagger }}$
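The one-sided behavior is easy to see numerically for a non-square matrix. The sketch below uses NumPy and a made-up 2 × 3 matrix with full row rank, so ${\displaystyle AA^{T}}$ is invertible:

```python
import numpy as np

# A made-up wide matrix (2 rows, 3 columns) with full row rank,
# so A A^T is invertible and the right pseudo-inverse exists.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# R = A^T (A A^T)^{-1}
R = A.T @ np.linalg.inv(A @ A.T)

# A R is the 2x2 identity, but R A is a 3x3 matrix that is not the identity.
AR = A @ R
RA = R @ A
```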

### Left Pseudo-Inverse

Consider the following matrix:

${\displaystyle L=[A^{T}A]^{-1}A^{T}}$

This matrix exists whenever ${\displaystyle A^{T}A}$ is invertible, which requires A to have full column rank. We call L the left pseudo-inverse of A because

${\displaystyle LA=I}$

but

${\displaystyle AL\neq I}$

We will denote the left pseudo-inverse of A as ${\displaystyle A^{\ddagger }}$
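The mirror-image check for the left pseudo-inverse, again in NumPy with a made-up 3 × 2 matrix of full column rank, so ${\displaystyle A^{T}A}$ is invertible:

```python
import numpy as np

# A made-up tall matrix (3 rows, 2 columns) with full column rank,
# so A^T A is invertible and the left pseudo-inverse exists.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# L = (A^T A)^{-1} A^T
L = np.linalg.inv(A.T @ A) @ A.T

# L A is the 2x2 identity, but A L is a 3x3 matrix that is not the identity.
LA = L @ A
AL = A @ L
```

For a full-column-rank matrix like this one, `np.linalg.pinv(A)` returns the same left pseudo-inverse, so in practice the explicit formula is rarely computed by hand.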