# Linear Algebra/Inner product spaces

Recall that in your study of vectors, we looked at an operation known as the dot product: given two vectors in ${\displaystyle \mathbb {R} ^{n}}$, we multiply their components together pairwise and sum the results. With the dot product, it becomes possible to introduce important new ideas like length and angle. The length of a vector ${\displaystyle \mathbf {a} }$ is just ${\displaystyle ||\mathbf {a} ||={\sqrt {\mathbf {a} \cdot \mathbf {a} }}}$. The angle ${\displaystyle \theta }$ between two vectors ${\displaystyle \mathbf {a} }$ and ${\displaystyle \mathbf {b} }$ is related to the dot product by

${\displaystyle \cos {\theta }={\frac {\mathbf {a} \cdot \mathbf {b} }{||\mathbf {a} ||||\mathbf {b} ||}}}$

It turns out that only a few properties of the dot product are necessary to define similar ideas in vector spaces other than ${\displaystyle \mathbb {R} ^{n}}$, such as the space of ${\displaystyle m\times n}$ matrices or the space of polynomials. The more general operation that takes the place of the dot product in these other spaces is called the "inner product".

## The inner product

Say we have two vectors:

${\displaystyle \mathbf {a} ={\begin{pmatrix}2\\1\\4\end{pmatrix}},\mathbf {b} ={\begin{pmatrix}6\\3\\0\end{pmatrix}}}$

To take their dot product, we compute

${\displaystyle \mathbf {a} \cdot \mathbf {b} =a_{1}b_{1}+a_{2}b_{2}+a_{3}b_{3}=(2)(6)+(1)(3)+(4)(0)=15}$

Because multiplication of real numbers is commutative, we have ${\displaystyle \mathbf {a} \cdot \mathbf {b} =\mathbf {b} \cdot \mathbf {a} }$.
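The computation above can be sketched in a few lines of Python (an illustrative snippet using the example vectors, not part of the text):

```python
# Dot product: multiply components pairwise and sum the results.
def dot(u, v):
    """Return the dot product of two equal-length real vectors."""
    return sum(x * y for x, y in zip(u, v))

a = [2, 1, 4]
b = [6, 3, 0]

print(dot(a, b))  # (2)(6) + (1)(3) + (4)(0) = 15
print(dot(b, a))  # 15 as well: the dot product is commutative
```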

Next, we observe that

${\displaystyle \mathbf {v} \cdot (\alpha \mathbf {a} +\beta \mathbf {b} )=\alpha \mathbf {v} \cdot \mathbf {a} +\beta \mathbf {v} \cdot \mathbf {b} }$

much like the familiar algebraic identity ${\displaystyle v(\alpha a+\beta b)=\alpha va+\beta vb}$ for numbers. For the dot product this holds because, in ${\displaystyle \mathbb {R} ^{3}}$ for example, both sides expand to

${\displaystyle {\begin{matrix}(\alpha v_{1}a_{1}+\beta v_{1}b_{1})+(\alpha v_{2}a_{2}+\beta v_{2}b_{2})+(\alpha v_{3}a_{3}+\beta v_{3}b_{3})=\\(\alpha v_{1}a_{1}+\alpha v_{2}a_{2}+\alpha v_{3}a_{3})+(\beta v_{1}b_{1}+\beta v_{2}b_{2}+\beta v_{3}b_{3})\end{matrix}}}$
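A quick numerical spot-check of this linearity property (an illustrative Python sketch; the vectors and scalars here are arbitrary choices):

```python
# Check that v . (alpha*a + beta*b) == alpha*(v . a) + beta*(v . b)
# for one arbitrary choice of vectors and scalars in R^3.
def dot(u, w):
    return sum(x * y for x, y in zip(u, w))

v = [1.0, -2.0, 3.0]
a = [2.0, 1.0, 4.0]
b = [6.0, 3.0, 0.0]
alpha, beta = 2.5, -1.5

combo = [alpha * x + beta * y for x, y in zip(a, b)]  # alpha*a + beta*b
lhs = dot(v, combo)
rhs = alpha * dot(v, a) + beta * dot(v, b)
print(lhs == rhs)  # True
```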

Finally, we can notice that ${\displaystyle \mathbf {v} \cdot \mathbf {v} }$ is always non-negative: checking this in ${\displaystyle \mathbb {R} ^{3}}$ gives

${\displaystyle \mathbf {v} \cdot \mathbf {v} =v_{1}^{2}+v_{2}^{2}+v_{3}^{2}}$

which can never be less than zero, since the square of a real number is non-negative. Note that ${\displaystyle \mathbf {v} \cdot \mathbf {v} =0}$ if and only if ${\displaystyle \mathbf {v} =\mathbf {0} }$.

In generalizing this sort of behaviour, we want to keep exactly these three properties. We can then move on to a definition of a generalization of the dot product, which we call the inner product. An inner product on a real vector space V, written < x, y >, is a function that maps V×V to R and obeys the following properties:

• < x, y > = < y, x >
• < v, αa + βb > = α < v, a > + β < v, b >
• < a, a > ≥ 0, < a, a > = 0 iff a = 0.

The vector space V together with a particular inner product is known as an inner product space.
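For instance, on the space of real ${\displaystyle m\times n}$ matrices, the entrywise sum of products (the Frobenius inner product) satisfies all three properties. A minimal sketch in Python, with arbitrary example matrices:

```python
# Frobenius inner product on real matrices:
# <A, B> = sum over i, j of A[i][j] * B[i][j].
def frobenius(A, B):
    return sum(a * b
               for row_a, row_b in zip(A, B)
               for a, b in zip(row_a, row_b))

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]

print(frobenius(A, B))  # 1*0 + 2*1 + 3*1 + 4*0 = 5
print(frobenius(B, A))  # 5: symmetric
print(frobenius(A, A))  # 1 + 4 + 9 + 16 = 30, non-negative
```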

## The dot product in ${\displaystyle \mathbb {C} ^{n}}$

Given two vectors ${\displaystyle \mathbf {a} =a_{1}{\vec {e}}_{1}+a_{2}{\vec {e}}_{2}+\dots +a_{n}{\vec {e}}_{n}\in \mathbb {C} ^{n}}$  and ${\displaystyle \mathbf {b} =b_{1}{\vec {e}}_{1}+b_{2}{\vec {e}}_{2}+\dots +b_{n}{\vec {e}}_{n}\in \mathbb {C} ^{n}}$ , the dot product generalized to complex numbers is:

${\displaystyle \mathbf {a} \cdot \mathbf {b} =\sum _{i=1}^{n}a_{i}^{*}b_{i}=a_{1}^{*}b_{1}+a_{2}^{*}b_{2}+\dots +a_{n}^{*}b_{n}}$

where ${\displaystyle z^{*}}$  for an arbitrary complex number ${\displaystyle z=c+di}$  is the complex conjugate: ${\displaystyle z^{*}=c-di}$ .

The dot product is "conjugate commutative": ${\displaystyle \mathbf {a} \cdot \mathbf {b} =(\mathbf {b} \cdot \mathbf {a} )^{*}}$ . One immediate consequence of the definition of the dot product is that the dot product of a vector with itself is always a non-negative real number: ${\displaystyle \mathbf {a} \cdot \mathbf {a} \geq 0}$ .

${\displaystyle \mathbf {a} \cdot \mathbf {a} =0}$  if and only if ${\displaystyle \mathbf {a} ={\vec {0}}}$
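These properties are easy to check numerically. A small Python sketch with arbitrary complex vectors (Python's `complex.conjugate()` plays the role of ${\displaystyle z^{*}}$):

```python
# Complex dot product: a . b = sum of conj(a_i) * b_i.
def cdot(u, v):
    return sum(x.conjugate() * y for x, y in zip(u, v))

a = [1 + 2j, 3 - 1j]
b = [2 - 1j, 1j]

print(cdot(a, b))              # (-1-2j)
print(cdot(b, a).conjugate())  # (-1-2j): conjugate commutativity
print(cdot(a, a))              # (15+0j): |1+2j|^2 + |3-1j|^2 = 5 + 10, real and >= 0
```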

### The Cauchy-Schwarz Inequality for ${\displaystyle \mathbb {C} ^{n}}$

**Cauchy-Schwarz Inequality:** Given two vectors ${\displaystyle \mathbf {a} ,\mathbf {b} \in \mathbb {C} ^{n}}$, it is the case that ${\displaystyle |\mathbf {a} \cdot \mathbf {b} |\leq |\mathbf {a} ||\mathbf {b} |}$.

In ${\displaystyle \mathbb {R} ^{n}}$ , the Cauchy-Schwarz inequality can be proven from the triangle inequality. Here, the Cauchy-Schwarz inequality will be proven algebraically.

To make the proof more intuitive, the algebraic proof for ${\displaystyle \mathbf {a} ,\mathbf {b} \in \mathbb {R} ^{n}}$  will be given first.

**Proof for ${\displaystyle \mathbf {a} ,\mathbf {b} \in \mathbb {R} ^{n}}$:**

${\displaystyle |\mathbf {a} \cdot \mathbf {b} |\leq |\mathbf {a} ||\mathbf {b} |}$  follows from ${\displaystyle |\mathbf {a} \cdot \mathbf {b} |^{2}\leq |\mathbf {a} |^{2}|\mathbf {b} |^{2}}$  which is equivalent to

${\displaystyle \left(\sum _{i=1}^{n}a_{i}b_{i}\right)^{2}\leq \left(\sum _{i=1}^{n}a_{i}^{2}\right)\left(\sum _{j=1}^{n}b_{j}^{2}\right)}$

Expanding both sides gives:

${\displaystyle \sum _{i=1}^{n}\sum _{j=1}^{n}a_{i}b_{i}a_{j}b_{j}\leq \sum _{i=1}^{n}\sum _{j=1}^{n}a_{i}^{2}b_{j}^{2}}$

${\displaystyle \iff \sum _{i=1}^{n}\sum _{j=1}^{n}(a_{i}b_{j})(a_{j}b_{i})\leq \sum _{i=1}^{n}\sum _{j=1}^{n}(a_{i}b_{j})^{2}}$

"Folding" the double sums along the diagonal ${\displaystyle i=j}$, and cancelling the diagonal terms, which are equal on both sides, gives:

${\displaystyle \iff \sum _{i=2}^{n}\sum _{j=1}^{i-1}2(a_{i}b_{j})(a_{j}b_{i})\leq \sum _{i=2}^{n}\sum _{j=1}^{i-1}((a_{i}b_{j})^{2}+(a_{j}b_{i})^{2})}$

${\displaystyle \iff 0\leq \sum _{i=2}^{n}\sum _{j=1}^{i-1}(a_{i}b_{j}-a_{j}b_{i})^{2}}$

The right-hand side is a sum of squares and hence non-negative, so the above inequality is true, and therefore the Cauchy-Schwarz inequality holds for ${\displaystyle \mathbf {a} ,\mathbf {b} \in \mathbb {R} ^{n}}$.
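The last step is an instance of the Lagrange identity: the gap ${\displaystyle |\mathbf {a} |^{2}|\mathbf {b} |^{2}-(\mathbf {a} \cdot \mathbf {b} )^{2}}$ equals exactly the folded sum of squares. A numerical sanity check in Python, reusing the example vectors from earlier (an illustrative sketch, not part of the proof):

```python
# Lagrange identity check:
# |a|^2 * |b|^2 - (a . b)^2 == sum over i < j of (a_i*b_j - a_j*b_i)^2.
a = [2, 1, 4]
b = [6, 3, 0]
n = len(a)

dot_ab = sum(x * y for x, y in zip(a, b))
gap = sum(x * x for x in a) * sum(y * y for y in b) - dot_ab ** 2
folded = sum((a[i] * b[j] - a[j] * b[i]) ** 2
             for i in range(1, n) for j in range(i))

print(gap, folded)  # 720 720: equal, and non-negative as the proof requires
```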

**Proof for ${\displaystyle \mathbf {a} ,\mathbf {b} \in \mathbb {C} ^{n}}$:**

Note that ${\displaystyle |\mathbf {a} \cdot \mathbf {b} |\leq |\mathbf {a} ||\mathbf {b} |}$  follows from ${\displaystyle |\mathbf {a} \cdot \mathbf {b} |^{2}\leq |\mathbf {a} |^{2}|\mathbf {b} |^{2}}$  which is equivalent to ${\displaystyle (\mathbf {a} \cdot \mathbf {b} )^{*}(\mathbf {a} \cdot \mathbf {b} )\leq (\mathbf {a} \cdot \mathbf {a} )(\mathbf {b} \cdot \mathbf {b} )}$ . Expanding both sides yields:

${\displaystyle \left(\sum _{i=1}^{n}a_{i}^{*}b_{i}\right)^{*}\left(\sum _{i=1}^{n}a_{i}^{*}b_{i}\right)\leq \left(\sum _{i=1}^{n}|a_{i}|^{2}\right)\left(\sum _{i=1}^{n}|b_{i}|^{2}\right)}$

${\displaystyle \iff \sum _{i=1}^{n}\sum _{j=1}^{n}(a_{i}^{*}b_{i})^{*}(a_{j}^{*}b_{j})\leq \sum _{i=1}^{n}\sum _{j=1}^{n}|a_{i}|^{2}|b_{j}|^{2}}$

${\displaystyle \iff \sum _{i=1}^{n}\sum _{j=1}^{n}(a_{j}b_{i})^{*}(a_{i}b_{j})\leq \sum _{i=1}^{n}\sum _{j=1}^{n}|a_{i}b_{j}|^{2}}$

"Folding" the double sums along the diagonal ${\displaystyle i=j}$, and cancelling the diagonal terms, which are equal on both sides, gives:

${\displaystyle \iff \sum _{i=2}^{n}\sum _{j=1}^{i-1}((a_{j}b_{i})^{*}(a_{i}b_{j})+(a_{i}b_{j})^{*}(a_{j}b_{i}))\leq \sum _{i=2}^{n}\sum _{j=1}^{i-1}(|a_{i}b_{j}|^{2}+|a_{j}b_{i}|^{2})}$

${\displaystyle \iff \sum _{i=2}^{n}\sum _{j=1}^{i-1}2\Re ((a_{i}b_{j})^{*}(a_{j}b_{i}))\leq \sum _{i=2}^{n}\sum _{j=1}^{i-1}(|a_{i}b_{j}|^{2}+|a_{j}b_{i}|^{2})}$

Given complex numbers ${\displaystyle z}$ and ${\displaystyle w}$, it can be proven that ${\displaystyle 2\Re (z^{*}w)\leq |z|^{2}+|w|^{2}}$ (this follows from ${\displaystyle 0\leq |z-w|^{2}=|z|^{2}+|w|^{2}-2\Re (z^{*}w)}$, much like ${\displaystyle 2xy\leq x^{2}+y^{2}}$ for real numbers). The above inequality therefore holds term by term, and the Cauchy-Schwarz inequality holds for ${\displaystyle \mathbf {a} ,\mathbf {b} \in \mathbb {C} ^{n}}$.
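To round off, a quick numerical check of the inequality for one arbitrary pair of complex vectors (an illustrative sketch, not part of the proof):

```python
# Check |a . b| <= |a| |b| for one arbitrary pair of vectors in C^3.
def cdot(u, v):
    return sum(x.conjugate() * y for x, y in zip(u, v))

a = [1 + 1j, 2 - 3j, 1j]
b = [4j, 1 - 1j, 2 + 2j]

lhs = abs(cdot(a, b))                                    # |a . b|
rhs = (cdot(a, a).real ** 0.5) * (cdot(b, b).real ** 0.5)  # |a| |b|
print(lhs <= rhs)  # True
```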