This subsection is optional. It requires material from the prior, also optional, subsection. The work done here will only be needed in the final two sections of Chapter Five.
The prior subsection suggests
that projecting onto the line spanned by $\vec{s}$
decomposes a vector $\vec{v}$ into two parts
\[
\vec{v} = \operatorname{proj}_{[\vec{s}]}(\vec{v}) + \bigl(\vec{v} - \operatorname{proj}_{[\vec{s}]}(\vec{v})\bigr)
\]
that are orthogonal and so are "not interacting".
We will now develop that suggestion.
Definition 2.1
Vectors $\vec{v}_1, \dots, \vec{v}_k \in \mathbb{R}^n$ are mutually orthogonal when any two are orthogonal: if $i \neq j$ then the dot product $\vec{v}_i \cdot \vec{v}_j$ is zero.
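For instance, the standard basis vectors of $\mathbb{R}^3$ are mutually orthogonal, since each pair has dot product zero:
\[
\vec{e}_1 \cdot \vec{e}_2 = (1)(0) + (0)(1) + (0)(0) = 0
\]
and likewise for the other two pairs.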
Theorem 2.2
If the vectors in a set $\{\vec{v}_1, \dots, \vec{v}_k\} \subseteq \mathbb{R}^n$ are mutually orthogonal and nonzero then that set is linearly independent.
Proof
Consider a linear relationship
$c_1\vec{v}_1 + c_2\vec{v}_2 + \dots + c_k\vec{v}_k = \vec{0}$.
If $i \in \{1, \dots, k\}$ then taking the dot product of $\vec{v}_i$
with both sides of the equation
\[
\vec{v}_i \cdot (c_1\vec{v}_1 + c_2\vec{v}_2 + \dots + c_k\vec{v}_k) = \vec{v}_i \cdot \vec{0}
\qquad\text{so that}\qquad
c_i\,(\vec{v}_i \cdot \vec{v}_i) = 0
\]
shows, since $\vec{v}_i$ is nonzero, that $c_i$ is zero.
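For a concrete instance of that argument (with vectors chosen here just for illustration), take these mutually orthogonal, nonzero vectors in $\mathbb{R}^3$.
\[
\vec{v}_1 = \begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix} \quad
\vec{v}_2 = \begin{pmatrix} 1 \\ -1 \\ 0 \end{pmatrix} \quad
\vec{v}_3 = \begin{pmatrix} 0 \\ 0 \\ 3 \end{pmatrix}
\]
Dotting $\vec{v}_1$ with both sides of $c_1\vec{v}_1 + c_2\vec{v}_2 + c_3\vec{v}_3 = \vec{0}$ kills the cross terms,
\[
c_1\,(\vec{v}_1 \cdot \vec{v}_1) + c_2\,(\vec{v}_1 \cdot \vec{v}_2) + c_3\,(\vec{v}_1 \cdot \vec{v}_3) = 2c_1 + 0 + 0 = 0
\]
so $c_1 = 0$, and dotting with $\vec{v}_2$ and $\vec{v}_3$ similarly forces $c_2 = c_3 = 0$.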
Corollary 2.3
If the vectors in a size $k$ subset of a $k$-dimensional space are mutually orthogonal and nonzero then that set is a basis for the space.
Proof
Any linearly independent size $k$ subset of a $k$-dimensional space is a basis.
Of course, the converse of Corollary 2.3
does not hold: not every basis of every subspace
of $\mathbb{R}^n$ is made of mutually orthogonal vectors.
However, we can get the partial converse
that for every subspace of $\mathbb{R}^n$ there is at least one basis
consisting of mutually orthogonal vectors.
Example 2.4
Consider a basis $B = \langle\vec{\beta}_1, \vec{\beta}_2\rangle$ for $\mathbb{R}^2$ whose members
are not orthogonal.
We can derive from $B$
a new basis for the same space that does have mutually orthogonal members.
For the first member of the new basis we simply use $\vec{\beta}_1$.
\[
\vec{\kappa}_1 = \vec{\beta}_1
\]
For the second member of the new basis,
we take away from $\vec{\beta}_2$ its part in the direction of $\vec{\kappa}_1$
\[
\vec{\kappa}_2 = \vec{\beta}_2 - \operatorname{proj}_{[\vec{\kappa}_1]}(\vec{\beta}_2)
\]
which leaves the part of $\vec{\beta}_2$ that is orthogonal to $\vec{\kappa}_1$ (it is orthogonal by the definition of the projection onto the span of $\vec{\kappa}_1$). Note that, by the corollary, $\langle\vec{\kappa}_1, \vec{\kappa}_2\rangle$ is a basis for $\mathbb{R}^2$.
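For a concrete instance (the vectors here are chosen for this illustration), suppose that
\[
\vec{\beta}_1 = \begin{pmatrix} 4 \\ 2 \end{pmatrix} \qquad \vec{\beta}_2 = \begin{pmatrix} 1 \\ 3 \end{pmatrix}
\]
Then $\vec{\kappa}_1 = \vec{\beta}_1$ and
\[
\vec{\kappa}_2 = \begin{pmatrix} 1 \\ 3 \end{pmatrix} - \frac{10}{20}\begin{pmatrix} 4 \\ 2 \end{pmatrix}
= \begin{pmatrix} -1 \\ 2 \end{pmatrix}
\]
and indeed $\vec{\kappa}_1 \cdot \vec{\kappa}_2 = (4)(-1) + (2)(2) = 0$.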
Definition 2.5
An orthogonal basis for a vector space is a basis of mutually orthogonal vectors.
Example 2.6
To turn a basis $B = \langle\vec{\beta}_1, \vec{\beta}_2, \vec{\beta}_3\rangle$ for $\mathbb{R}^3$
into an orthogonal basis, we take the first vector as it is given.
\[
\vec{\kappa}_1 = \vec{\beta}_1
\]
We get $\vec{\kappa}_2$ by starting with the given second vector $\vec{\beta}_2$
and subtracting away the part of it in the direction of $\vec{\kappa}_1$.
\[
\vec{\kappa}_2 = \vec{\beta}_2 - \operatorname{proj}_{[\vec{\kappa}_1]}(\vec{\beta}_2)
\]
Finally, we get $\vec{\kappa}_3$
by taking the third given vector $\vec{\beta}_3$ and subtracting the part of it
in the direction of $\vec{\kappa}_1$, and also the part
of it in the direction of $\vec{\kappa}_2$.
\[
\vec{\kappa}_3 = \vec{\beta}_3 - \operatorname{proj}_{[\vec{\kappa}_1]}(\vec{\beta}_3) - \operatorname{proj}_{[\vec{\kappa}_2]}(\vec{\beta}_3)
\]
Again the corollary gives that $\langle\vec{\kappa}_1, \vec{\kappa}_2, \vec{\kappa}_3\rangle$
is a basis for the space.
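To see the three steps with numbers (again, vectors of our own choosing for the illustration), take
\[
\vec{\beta}_1 = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix} \quad
\vec{\beta}_2 = \begin{pmatrix} 0 \\ 2 \\ 0 \end{pmatrix} \quad
\vec{\beta}_3 = \begin{pmatrix} 1 \\ 0 \\ 3 \end{pmatrix}
\]
Then $\vec{\kappa}_1 = \vec{\beta}_1$ and
\[
\vec{\kappa}_2 = \begin{pmatrix} 0 \\ 2 \\ 0 \end{pmatrix} - \frac{2}{3}\begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}
= \begin{pmatrix} -2/3 \\ 4/3 \\ -2/3 \end{pmatrix}
\qquad
\vec{\kappa}_3 = \begin{pmatrix} 1 \\ 0 \\ 3 \end{pmatrix} - \frac{4}{3}\begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix} - \frac{-8/3}{8/3}\begin{pmatrix} -2/3 \\ 4/3 \\ -2/3 \end{pmatrix}
= \begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix}
\]
A quick check confirms that $\vec{\kappa}_1 \cdot \vec{\kappa}_2 = \vec{\kappa}_1 \cdot \vec{\kappa}_3 = \vec{\kappa}_2 \cdot \vec{\kappa}_3 = 0$.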
The next result verifies that
the process used in those examples works with any basis for any
subspace of an $\mathbb{R}^n$
(we are restricted to $\mathbb{R}^n$ only because we have not given a
definition of orthogonality for other vector spaces).
Theorem 2.7 (Gram-Schmidt orthogonalization)
If $\langle\vec{\beta}_1, \vec{\beta}_2, \dots, \vec{\beta}_k\rangle$
is a basis for a subspace of $\mathbb{R}^n$ then, where
\[
\begin{aligned}
\vec{\kappa}_1 &= \vec{\beta}_1 \\
\vec{\kappa}_2 &= \vec{\beta}_2 - \operatorname{proj}_{[\vec{\kappa}_1]}(\vec{\beta}_2) \\
\vec{\kappa}_3 &= \vec{\beta}_3 - \operatorname{proj}_{[\vec{\kappa}_1]}(\vec{\beta}_3) - \operatorname{proj}_{[\vec{\kappa}_2]}(\vec{\beta}_3) \\
&\;\;\vdots \\
\vec{\kappa}_k &= \vec{\beta}_k - \operatorname{proj}_{[\vec{\kappa}_1]}(\vec{\beta}_k) - \dots - \operatorname{proj}_{[\vec{\kappa}_{k-1}]}(\vec{\beta}_k)
\end{aligned}
\]
the $\vec{\kappa}$'s form an orthogonal basis for the same subspace.
Proof
We will use induction to check that each $\vec{\kappa}_i$ is nonzero,
is in the span of $\langle\vec{\beta}_1, \dots, \vec{\beta}_i\rangle$,
and is orthogonal to all preceding vectors:
$\vec{\kappa}_1 \cdot \vec{\kappa}_i = \dots = \vec{\kappa}_{i-1} \cdot \vec{\kappa}_i = 0$.
With those, and with
Corollary 2.3, we will have that
$\langle\vec{\kappa}_1, \dots, \vec{\kappa}_k\rangle$
is a basis for the same space as
$\langle\vec{\beta}_1, \dots, \vec{\beta}_k\rangle$.
We shall cover the cases up to $i = 3$, which give the
sense of the argument.
Completing the details is Problem 15.
The $i = 1$ case is trivial: setting $\vec{\kappa}_1$ equal to $\vec{\beta}_1$
makes it a nonzero vector since $\vec{\beta}_1$ is a member of a basis,
it is obviously in the desired span,
and the "orthogonal to all preceding vectors" condition is vacuously met.
For the $i = 2$ case, expand the definition of $\vec{\kappa}_2$.
\[
\vec{\kappa}_2 = \vec{\beta}_2 - \operatorname{proj}_{[\vec{\kappa}_1]}(\vec{\beta}_2)
= \vec{\beta}_2 - \frac{\vec{\beta}_2 \cdot \vec{\kappa}_1}{\vec{\kappa}_1 \cdot \vec{\kappa}_1}\,\vec{\kappa}_1
= \vec{\beta}_2 - \frac{\vec{\beta}_2 \cdot \vec{\kappa}_1}{\vec{\kappa}_1 \cdot \vec{\kappa}_1}\,\vec{\beta}_1
\]
This expansion shows that $\vec{\kappa}_2$ is nonzero or else this
would be a non-trivial linear dependence among the $\vec{\beta}$'s
(it is nontrivial because the coefficient of $\vec{\beta}_2$ is $1$)
and also shows that $\vec{\kappa}_2$
is in the desired span.
Finally, $\vec{\kappa}_2$ is orthogonal to the only preceding vector
$\vec{\kappa}_1 = \vec{\beta}_1$
because this projection is orthogonal.
The $i = 3$ case is the same as the $i = 2$ case except for one detail.
As in the $i = 2$ case, expanding the definition
\[
\vec{\kappa}_3 = \vec{\beta}_3
- \frac{\vec{\beta}_3 \cdot \vec{\kappa}_1}{\vec{\kappa}_1 \cdot \vec{\kappa}_1}\,\vec{\kappa}_1
- \frac{\vec{\beta}_3 \cdot \vec{\kappa}_2}{\vec{\kappa}_2 \cdot \vec{\kappa}_2}\,\vec{\kappa}_2
\]
shows that $\vec{\kappa}_3$ is nonzero and is in the span.
A calculation shows that $\vec{\kappa}_3$
is orthogonal to the preceding vector $\vec{\kappa}_1$.
\[
\begin{aligned}
\vec{\kappa}_1 \cdot \vec{\kappa}_3
&= \vec{\kappa}_1 \cdot \Bigl(\vec{\beta}_3
- \frac{\vec{\beta}_3 \cdot \vec{\kappa}_1}{\vec{\kappa}_1 \cdot \vec{\kappa}_1}\,\vec{\kappa}_1
- \frac{\vec{\beta}_3 \cdot \vec{\kappa}_2}{\vec{\kappa}_2 \cdot \vec{\kappa}_2}\,\vec{\kappa}_2\Bigr) \\
&= \vec{\kappa}_1 \cdot \Bigl(\vec{\beta}_3
- \frac{\vec{\beta}_3 \cdot \vec{\kappa}_1}{\vec{\kappa}_1 \cdot \vec{\kappa}_1}\,\vec{\kappa}_1\Bigr)
- \frac{\vec{\beta}_3 \cdot \vec{\kappa}_2}{\vec{\kappa}_2 \cdot \vec{\kappa}_2}\,(\vec{\kappa}_1 \cdot \vec{\kappa}_2) \\
&= 0
\end{aligned}
\]
(Here's the difference from the $i = 2$ case: the second line has
two kinds of terms.
The first term is zero because this projection is orthogonal, as in the
$i = 2$ case.
The second term is zero because $\vec{\kappa}_1$ is orthogonal
to $\vec{\kappa}_2$ and so is orthogonal to any vector
in the line spanned by $\vec{\kappa}_2$.)
The check that $\vec{\kappa}_3$ is also
orthogonal to the other preceding vector $\vec{\kappa}_2$ is similar.
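For readers who would like to experiment, here is a minimal computational sketch of the theorem's formulas in Python; the names dot, project, and gram_schmidt, and the sample basis, are inventions for this sketch, not notation from the text.

def dot(u, v):
    # dot product of two same-length vectors
    return sum(ui * vi for ui, vi in zip(u, v))

def project(v, kappa):
    # projection of v onto the line spanned by the nonzero vector kappa
    scale = dot(v, kappa) / dot(kappa, kappa)
    return [scale * ki for ki in kappa]

def gram_schmidt(betas):
    # kappa_i = beta_i - proj_[kappa_1](beta_i) - ... - proj_[kappa_(i-1)](beta_i)
    kappas = []
    for beta in betas:
        kappa = list(beta)
        for prev in kappas:
            kappa = [ki - pi for ki, pi in zip(kappa, project(beta, prev))]
        kappas.append(kappa)
    return kappas

B = [[1.0, 1.0, 1.0], [0.0, 2.0, 0.0], [1.0, 0.0, 3.0]]  # sample basis for R^3
K = gram_schmidt(B)
for i, u in enumerate(K):
    for w in K[i + 1:]:
        assert abs(dot(u, w)) < 1e-12  # the kappa's are mutually orthogonal
print(K)  # [[1,1,1], [-2/3,4/3,-2/3], [-1,0,1]], up to rounding

Running it on the sample basis reproduces the numbers worked after Example 2.6 above.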
Beyond having the vectors in the basis be orthogonal, we can do more; we can arrange for each vector to have length one by dividing each by its own length (we can normalize the lengths).
Example 2.8
Normalizing the length of each vector in the orthogonal basis
$\langle\vec{\kappa}_1, \vec{\kappa}_2, \vec{\kappa}_3\rangle$ of
Example 2.6
produces the orthonormal basis
$\langle \vec{\kappa}_1/|\vec{\kappa}_1|,\ \vec{\kappa}_2/|\vec{\kappa}_2|,\ \vec{\kappa}_3/|\vec{\kappa}_3| \rangle$.
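Continuing the illustrative numbers used after Example 2.6 above, normalizing that orthogonal basis gives
\[
\Bigl\langle \frac{1}{\sqrt{3}}\begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix},\
\frac{1}{\sqrt{6}}\begin{pmatrix} -1 \\ 2 \\ -1 \end{pmatrix},\
\frac{1}{\sqrt{2}}\begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix} \Bigr\rangle
\]
where each vector now has length one and the pairwise dot products are still zero.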
Besides its intuitive appeal, and its analogy with the
standard basis $\mathcal{E}_n$ for $\mathbb{R}^n$, an orthonormal basis also simplifies
some computations.
See Problem 9, for example.
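One such simplification, made precise in Problem 9: with an orthonormal basis $\langle\vec{\kappa}_1, \dots, \vec{\kappa}_k\rangle$ each $\vec{\kappa}_i \cdot \vec{\kappa}_i = 1$, so representing a member $\vec{v}$ of the space requires no division; the coefficients are bare dot products.
\[
\vec{v} = (\vec{v} \cdot \vec{\kappa}_1)\,\vec{\kappa}_1 + \dots + (\vec{v} \cdot \vec{\kappa}_k)\,\vec{\kappa}_k
\]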
Problem 1
Perform the Gram-Schmidt process on each of these bases
for $\mathbb{R}^2$.
Then turn those orthogonal bases into orthonormal bases.
This exercise is recommended for all readers.
Problem 2
Perform the Gram-Schmidt process on each of these bases
for $\mathbb{R}^3$.
Then turn those orthogonal bases into orthonormal bases.
This exercise is recommended for all readers.
Problem 3
Find an orthonormal basis for this subspace of $\mathbb{R}^3$: the
plane $x - y + z = 0$.
Problem 4
Find an orthonormal basis for this subspace of $\mathbb{R}^4$.
Problem 5
Show that any linearly independent subset of $\mathbb{R}^n$ can be
orthogonalized without changing its span.
This exercise is recommended for all readers.
Problem 6
What happens if we apply the Gram-Schmidt process to
a basis that is already orthogonal?
Problem 7
Let
$\{\vec{\kappa}_1, \dots, \vec{\kappa}_k\}$
be a set of mutually orthogonal vectors in $\mathbb{R}^n$.
Prove that for any $\vec{v}$ in the space, the vector
$\vec{v} - (\operatorname{proj}_{[\vec{\kappa}_1]}(\vec{v}) + \dots + \operatorname{proj}_{[\vec{\kappa}_k]}(\vec{v}))$
is orthogonal to each of $\vec{\kappa}_1$, ..., $\vec{\kappa}_k$.
Illustrate the prior item in $\mathbb{R}^3$ by using $\vec{e}_1$ as
$\vec{\kappa}_1$, using $\vec{e}_2$ as $\vec{\kappa}_2$, and
taking $\vec{v}$ to have components $1$, $2$, and $3$.
Show that $\operatorname{proj}_{[\vec{\kappa}_1]}(\vec{v}) + \dots + \operatorname{proj}_{[\vec{\kappa}_k]}(\vec{v})$ is the vector in the
span of the set of $\vec{\kappa}$'s that is closest to $\vec{v}$.
Hint. To the illustration done for the prior part,
add a vector $d_1\vec{\kappa}_1 + d_2\vec{\kappa}_2$
and apply the Pythagorean Theorem to the resulting triangle.
Problem 8
Find a vector in $\mathbb{R}^3$ that is orthogonal to both of these.
This exercise is recommended for all readers.
Problem 9
One advantage of orthogonal bases is that they simplify finding the
representation of a vector with respect to that basis.
For this vector and this non-orthogonal basis for $\mathbb{R}^2$,
first represent the vector with respect to the basis.
Then project the vector onto the span of each of the two basis vectors
$\vec{\beta}_1$ and $\vec{\beta}_2$.
With this orthogonal basis for $\mathbb{R}^2$,
represent the same vector with respect to the basis.
Then project the vector onto the span of each basis vector.
Note that the coefficients in the representation and the projection
are the same.
Let
$K = \langle\vec{\kappa}_1, \dots, \vec{\kappa}_k\rangle$
be an orthogonal basis for some subspace of $\mathbb{R}^n$.
Prove that for any $\vec{v}$ in the subspace,
the $i$-th component of the representation $\operatorname{Rep}_K(\vec{v})$
is the scalar coefficient
$(\vec{v} \cdot \vec{\kappa}_i)/(\vec{\kappa}_i \cdot \vec{\kappa}_i)$
from $\operatorname{proj}_{[\vec{\kappa}_i]}(\vec{v})$.
Prove that
$\vec{v} = \operatorname{proj}_{[\vec{\kappa}_1]}(\vec{v}) + \dots + \operatorname{proj}_{[\vec{\kappa}_k]}(\vec{v})$.
Problem 10
Bessel's Inequality.
Consider these orthonormal sets
\[
B_1 = \{\vec{e}_1\} \quad
B_2 = \{\vec{e}_1, \vec{e}_2\} \quad
B_3 = \{\vec{e}_1, \vec{e}_2, \vec{e}_3\} \quad
B_4 = \{\vec{e}_1, \vec{e}_2, \vec{e}_3, \vec{e}_4\}
\]
along with the vector $\vec{v} \in \mathbb{R}^4$ whose components are
$4$, $3$, $2$, and $1$.
Find the coefficient $c_1$ for the projection of $\vec{v}$
onto the span of the vector in $B_1$.
Check that $|\vec{v}|^2 \geq |c_1|^2$.
Find the coefficients $c_1$ and $c_2$ for the projection of $\vec{v}$
onto the spans of the two vectors in $B_2$.
Check that $|\vec{v}|^2 \geq |c_1|^2 + |c_2|^2$.
Find $c_1$, $c_2$, and $c_3$ associated with the vectors in
$B_3$, and $c_1$, $c_2$, $c_3$, and $c_4$ for the vectors in $B_4$.
Check that
$|\vec{v}|^2 \geq |c_1|^2 + \dots + |c_3|^2$
and
that
$|\vec{v}|^2 \geq |c_1|^2 + \dots + |c_4|^2$.
Show that this holds in general: where
$\{\vec{\kappa}_1, \dots, \vec{\kappa}_k\}$
is an orthonormal set and $c_i$ is the coefficient of
the projection of a vector $\vec{v}$ from the space,
then
$|\vec{v}|^2 \geq |c_1|^2 + \dots + |c_k|^2$.
Hint. One way is to look at the inequality
$0 \leq |\vec{v} - (c_1\vec{\kappa}_1 + \dots + c_k\vec{\kappa}_k)|^2$
and expand the $c$'s.
Problem 11
Prove or disprove: every vector in $\mathbb{R}^n$ is in some orthogonal
basis.
Problem 12
Show that the columns of an $n \times n$ matrix form an orthonormal
set if and only if the inverse of the matrix is its transpose.
Produce such a matrix.
Problem 13
Does the proof of Theorem 2.2 fail to consider the
possibility that the set of vectors is empty (i.e., that $k = 0$)?
Problem 14
Theorem 2.7 describes a change of basis
from any basis
$B = \langle\vec{\beta}_1, \dots, \vec{\beta}_k\rangle$
to one that is orthogonal,
$K = \langle\vec{\kappa}_1, \dots, \vec{\kappa}_k\rangle$.
Consider the change of basis matrix $\operatorname{Rep}_{B,K}(\operatorname{id})$.
Prove that the matrix $\operatorname{Rep}_{K,B}(\operatorname{id})$
changing bases in the direction opposite to that of the theorem
has an upper triangular shape: all
of its entries below the main diagonal are zeros.
Prove that the inverse of an upper triangular matrix is
also upper triangular (if the matrix is invertible, that is).
This shows that the matrix $\operatorname{Rep}_{B,K}(\operatorname{id})$ changing bases
in the direction described in the theorem is upper triangular.
Problem 15
Complete the induction argument in the proof of
Theorem 2.7.