In the prior subsection we defined the basis of a vector space, and we saw that a space can have many different bases. For example, following the definition of a basis, we saw three different bases for $\mathbb{R}^2$. So we cannot talk about "the" basis for a vector space. True, some vector spaces have bases that strike us as more natural than others, for instance, $\mathbb{R}^n$'s basis $\mathcal{E}_n$ or $\mathcal{P}_n$'s basis $\langle 1, x, \dots, x^n \rangle$ or $\mathcal{M}_{2\times 2}$'s basis of the four matrices having a single entry $1$ and all other entries $0$. But in many spaces no particular basis leaps out at us as the most natural one. We cannot, in general, associate with a space any single basis that best describes that space.
We can, however, find something about the bases that is uniquely associated with the space. This subsection shows that any two bases for a space have the same number of elements. So, with each space we can associate a number, the number of vectors in any of its bases.
This brings us back to when we considered the two things that could be meant by the term "minimal spanning set". At that point we defined "minimal" as linearly independent, but we noted that another reasonable interpretation of the term is that a spanning set is "minimal" when it has the fewest elements of any set with the same span. At the end of this subsection, after we have shown that all bases have the same number of elements, we will have shown that the two senses of "minimal" are equivalent.
Before we start, we first limit our attention to spaces where at least one basis has only finitely many members.
A vector space is finite-dimensional if it has a basis with only finitely many vectors.
(One reason for sticking to finite-dimensional spaces is so that the representation of a vector with respect to a basis is a finitely-tall vector, and so can be easily written.) From now on we study only finite-dimensional vector spaces. We shall take the term "vector space" to mean "finite-dimensional vector space". Other spaces are interesting and important, but they lie outside of our scope.
To prove the main theorem we shall use a technical result.
Lemma 2.2 (Exchange Lemma)
Assume that $B = \langle \vec{\beta}_1, \dots, \vec{\beta}_n \rangle$ is a basis for a vector space, and that for the vector $\vec{v}$ the relationship $\vec{v} = c_1\vec{\beta}_1 + c_2\vec{\beta}_2 + \dots + c_n\vec{\beta}_n$ has $c_i \neq 0$. Then exchanging $\vec{\beta}_i$ for $\vec{v}$ yields another basis for the space.
Call the outcome of the exchange $\hat{B} = \langle \vec{\beta}_1, \dots, \vec{\beta}_{i-1}, \vec{v}, \vec{\beta}_{i+1}, \dots, \vec{\beta}_n \rangle$.
We first show that $\hat{B}$ is linearly independent. Any relationship $d_1\vec{\beta}_1 + \dots + d_i\vec{v} + \dots + d_n\vec{\beta}_n = \vec{0}$ among the members of $\hat{B}$, after substitution for $\vec{v}$,
$$d_1\vec{\beta}_1 + \dots + d_i\,(c_1\vec{\beta}_1 + \dots + c_i\vec{\beta}_i + \dots + c_n\vec{\beta}_n) + \dots + d_n\vec{\beta}_n = \vec{0} \qquad (*)$$
gives a linear relationship among the members of $B$. The basis $B$ is linearly independent, so the coefficient $d_ic_i$ of $\vec{\beta}_i$ is zero. Because $c_i$ is assumed to be nonzero, $d_i = 0$. Using this in equation $(*)$ above gives that all of the other $d$'s are also zero. Therefore $\hat{B}$ is linearly independent.
We finish by showing that $\hat{B}$ has the same span as $B$. Half of this argument, that $[\hat{B}] \subseteq [B]$, is easy; any member $d_1\vec{\beta}_1 + \dots + d_i\vec{v} + \dots + d_n\vec{\beta}_n$ of $[\hat{B}]$ can be written $d_1\vec{\beta}_1 + \dots + d_i\,(c_1\vec{\beta}_1 + \dots + c_n\vec{\beta}_n) + \dots + d_n\vec{\beta}_n$, which is a linear combination of linear combinations of members of $B$, and hence is in $[B]$. For the $[B] \subseteq [\hat{B}]$ half of the argument, recall that when $\vec{v} = c_1\vec{\beta}_1 + \dots + c_n\vec{\beta}_n$ with $c_i \neq 0$, then the equation can be rearranged to $\vec{\beta}_i = (-c_1/c_i)\vec{\beta}_1 + \dots + (1/c_i)\vec{v} + \dots + (-c_n/c_i)\vec{\beta}_n$. Now, consider any member $d_1\vec{\beta}_1 + \dots + d_i\vec{\beta}_i + \dots + d_n\vec{\beta}_n$ of $[B]$, substitute for $\vec{\beta}_i$ its expression as a linear combination of the members of $\hat{B}$, and recognize (as in the first half of this argument) that the result is a linear combination of linear combinations of members of $\hat{B}$, and hence is in $[\hat{B}]$.
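The hypothesis that $c_i \neq 0$ is doing real work in this lemma. Here is a small numerical check of both cases, a sketch in Python; the `rank` helper, which row-reduces over the rationals, is ours and not part of the text:

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix (a list of rows) by Gauss-Jordan elimination over Q."""
    if not rows:
        return 0
    rows = [[Fraction(a) for a in row] for row in rows]
    r = 0
    for c in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if rows[i][c] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][c] != 0:
                f = rows[i][c] / rows[r][c]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

# The standard basis of R^3, and v = 1*b1 + 2*b2 + 0*b3.
b1, b2, b3 = [1, 0, 0], [0, 1, 0], [0, 0, 1]
v = [1, 2, 0]

# v's coefficient on b2 is 2, nonzero: exchanging b2 for v gives a basis.
good_swap = [b1, v, b3]
assert rank(good_swap) == 3   # three independent vectors spanning R^3

# v's coefficient on b3 is 0: exchanging b3 for v does NOT give a basis,
# since v already lies in the span of b1 and b2.
bad_swap = [b1, b2, v]
assert rank(bad_swap) == 2
```

Three vectors in $\mathbb{R}^3$ form a basis exactly when their matrix has rank $3$, which is what the two assertions test.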
In any finite-dimensional vector space, all of the bases have the same number of elements.
Fix a vector space with at least one finite basis. Choose, from among all of this space's bases, one $B = \langle \vec{\beta}_1, \dots, \vec{\beta}_n \rangle$ of minimal size. We will show that any other basis $D = \langle \vec{\delta}_1, \vec{\delta}_2, \dots \rangle$ also has the same number of members, $n$. Because $B$ has minimal size, $D$ has no fewer than $n$ vectors. We will argue that it cannot have more than $n$ vectors.
The basis $B$ spans the space and $\vec{\delta}_1$ is in the space, so $\vec{\delta}_1$ is a nontrivial linear combination of elements of $B$. By the Exchange Lemma, $\vec{\delta}_1$ can be swapped for a vector from $B$, resulting in a basis $B_1$, where one element is $\vec{\delta}_1$ and all of the $n-1$ other elements are $\vec{\beta}$'s.
The prior paragraph forms the basis step for an induction argument. The inductive step starts with a basis $B_k$ (for $1 \leq k < n$) containing $k$ members of $D$ and $n-k$ members of $B$. We know that $D$ has at least $n$ members, so there is a $\vec{\delta}_{k+1}$. Represent it as a linear combination of elements of $B_k$. The key point: in that representation, at least one of the nonzero scalars must be associated with a $\vec{\beta}_i$, or else that representation would be a nontrivial linear relationship among elements of the linearly independent set $D$. Exchange $\vec{\beta}_i$ for $\vec{\delta}_{k+1}$ to get a new basis $B_{k+1}$ with one more $\vec{\delta}$ and one fewer $\vec{\beta}$ than the previous basis $B_k$.
Repeat the inductive step until no $\vec{\beta}$'s remain, so that $B_n$ contains $\vec{\delta}_1, \dots, \vec{\delta}_n$. Now, $D$ cannot have more than these $n$ vectors because any $\vec{\delta}_{n+1}$ that remains would be in the span of $B_n$ (since it is a basis) and hence would be a linear combination of the other $\vec{\delta}$'s, contradicting that $D$ is linearly independent.
The dimension of a vector space is the number of vectors in any of its bases.
Any basis for $\mathbb{R}^n$ has $n$ vectors since the standard basis $\mathcal{E}_n$ has $n$ vectors. Thus, this definition generalizes the most familiar use of the term, that $\mathbb{R}^n$ is $n$-dimensional.
The space $\mathcal{P}_n$ of polynomials of degree at most $n$ has dimension $n+1$. We can show this by exhibiting any basis — $\langle 1, x, \dots, x^n \rangle$ comes to mind — and counting its $n+1$ members.
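To see the count concretely for $n = 2$: identifying each polynomial $a_0 + a_1x + a_2x^2$ with its coefficient vector $(a_0, a_1, a_2)$, any basis we exhibit has three members. A sketch in Python; the `rank` helper, row reduction over the rationals, is ours and not from the text:

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix (a list of rows) by Gauss-Jordan elimination over Q."""
    if not rows:
        return 0
    rows = [[Fraction(a) for a in row] for row in rows]
    r = 0
    for c in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if rows[i][c] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][c] != 0:
                f = rows[i][c] / rows[r][c]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

# Polynomials in P_2 as coefficient vectors (a0, a1, a2).
monomial_basis = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # <1, x, x^2>
other_basis    = [[1, 0, 0], [1, 1, 0], [1, 1, 1]]   # <1, 1+x, 1+x+x^2>

# Both sets of three vectors are independent, so both are bases of P_2.
assert rank(monomial_basis) == 3
assert rank(other_basis) == 3
# Either count gives dim P_2 = 2 + 1 = 3.
```

Two quite different bases, but the same count of members, as the theorem guarantees.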
A trivial space is zero-dimensional since its basis is empty.
Again, although we sometimes say "finite-dimensional" as a reminder, in the rest of this book all vector spaces are assumed to be finite-dimensional. An instance of this is that in the next result the word "space" should be taken to mean "finite-dimensional vector space".
No linearly independent set can have a size greater than the dimension of the enclosing space.
Inspection of the above proof shows that it never uses that $D$ spans the space, only that $D$ is linearly independent.
Recall the subspace diagram from the prior section showing the subspaces of $\mathbb{R}^3$. Each subspace shown is described with a minimal spanning set, for which we now have the term "basis". The whole space has a basis with three members, the plane subspaces have bases with two members, the line subspaces have bases with one member, and the trivial subspace has a basis with zero members. When we saw that diagram we could not show that these are the only subspaces that this space has. We can show it now. The prior corollary proves that the only subspaces of $\mathbb{R}^3$ are either three-, two-, one-, or zero-dimensional. Therefore, the diagram indicates all of the subspaces. There are no subspaces somehow, say, between lines and planes.
Any linearly independent set can be expanded to make a basis.
If a linearly independent set is not already a basis then it must not span the space. Adding to it a vector that is not in the span preserves linear independence. Keep adding, until the resulting set does span the space, which the prior corollary shows will happen after only a finite number of steps.
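The expansion process in this proof can be sketched directly. In the hypothetical Python below (the `rank` helper, by row reduction over the rationals, is ours, not from the text), an independent set grows by adding any candidate vector outside its current span:

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix (a list of rows) by Gauss-Jordan elimination over Q."""
    if not rows:
        return 0
    rows = [[Fraction(a) for a in row] for row in rows]
    r = 0
    for c in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if rows[i][c] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][c] != 0:
                f = rows[i][c] / rows[r][c]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def expand_to_basis(independent, candidates):
    """Grow an independent list by adding each candidate not in its span."""
    basis = list(independent)
    for v in candidates:
        if rank(basis + [v]) > rank(basis):   # v lies outside the span so far
            basis.append(v)
    return basis

# Start from one independent vector in R^3; offer the standard basis vectors.
start = [[1, 1, 0]]
candidates = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
basis = expand_to_basis(start, candidates)
assert len(basis) == 3 and rank(basis) == 3   # expanded to a basis of R^3
```

Note that the middle candidate $(0, 1, 0)$ is skipped, since it is already in the span of the first two vectors kept; independence is preserved at every step, just as in the proof.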
Any spanning set can be shrunk to a basis.
Call the spanning set $S$. If $S$ is empty then it is already a basis (the space must be a trivial space). If $S = \{\vec{0}\}$ then it can be shrunk to the empty basis, thereby making it linearly independent, without changing its span.
Otherwise, $S$ contains a vector $\vec{s}_1$ with $\vec{s}_1 \neq \vec{0}$ and we can form a basis $B_1 = \langle \vec{s}_1 \rangle$. If $[B_1] = [S]$ then we are done.
If not then there is a $\vec{s}_2 \in [S]$ such that $\vec{s}_2 \notin [B_1]$. Let $B_2 = \langle \vec{s}_1, \vec{s}_2 \rangle$; if $[B_2] = [S]$ then we are done.
We can repeat this process until the spans are equal, which must happen in at most finitely many steps.
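The shrinking process is likewise easy to sketch. In the hypothetical Python below (the `rank` helper, row reduction over the rationals, is ours, not from the text), a vector from the spanning set is kept exactly when it enlarges the span of the vectors kept so far:

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix (a list of rows) by Gauss-Jordan elimination over Q."""
    if not rows:
        return 0
    rows = [[Fraction(a) for a in row] for row in rows]
    r = 0
    for c in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if rows[i][c] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][c] != 0:
                f = rows[i][c] / rows[r][c]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def shrink_to_basis(spanning):
    """Keep each vector only if it enlarges the span of those kept so far."""
    basis = []
    for v in spanning:
        if rank(basis + [v]) > rank(basis):
            basis.append(v)
    return basis

# A redundant spanning set for the xy-plane inside R^3.
spanning = [[0, 0, 0], [1, 0, 0], [2, 0, 0], [0, 1, 0], [1, 1, 0]]
basis = shrink_to_basis(spanning)
assert len(basis) == 2 and rank(basis) == 2   # shrunk to an independent pair
```

The zero vector and the redundant vectors are dropped, and what remains is independent with the same span, which is the proof's conclusion.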
In an $n$-dimensional space, a set of $n$ vectors is linearly independent if and only if it spans the space.
First we will show that a subset with $n$ vectors is linearly independent if and only if it is a basis. "If" is trivially true — bases are linearly independent. "Only if" holds because a linearly independent set can be expanded to a basis, but a basis has $n$ elements, so this expansion is actually the set that we began with.
To finish, we will show that any subset with $n$ vectors spans the space if and only if it is a basis. Again, "if" is trivial. "Only if" holds because any spanning set can be shrunk to a basis, but a basis has $n$ elements and so this shrunken set is just the one we started with.
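As a concrete check in $\mathbb{R}^3$ (a sketch; the `rank` helper by rational row reduction is ours, not from the text): a set of three vectors either is independent and spans, or does neither, and a single rank computation detects both at once.

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix (a list of rows) by Gauss-Jordan elimination over Q."""
    if not rows:
        return 0
    rows = [[Fraction(a) for a in row] for row in rows]
    r = 0
    for c in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if rows[i][c] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][c] != 0:
                f = rows[i][c] / rows[r][c]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

# Rank 3: these three vectors are independent AND span R^3 (a basis).
three = [[1, 0, 0], [0, 1, 0], [1, 1, 1]]
assert rank(three) == 3

# Rank 2: these three are dependent AND fail to span R^3 (neither property).
dependent = [[1, 0, 0], [0, 1, 0], [1, 1, 0]]
assert rank(dependent) == 2
```

So for $n$ vectors in an $n$-dimensional space, "rank equals $n$" certifies independence and spanning simultaneously, which is the content of the result.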
The main result of this subsection, that all of the bases in a finite-dimensional vector space have the same number of elements, is the single most important result in this book because, as Example 2.9 shows, it describes what vector spaces and subspaces there can be. We will see more in the next chapter.
The case of infinite-dimensional vector spaces is somewhat controversial. The statement "any infinite-dimensional vector space has a basis" is known to be equivalent to a statement called the Axiom of Choice (see (Blass 1984)). Mathematicians differ philosophically on whether to accept or reject this statement as an axiom on which to base mathematics (although the great majority seem to accept it). Consequently the question about infinite-dimensional vector spaces is still somewhat up in the air. (A discussion of the Axiom of Choice can be found in the Frequently Asked Questions list for the Usenet group sci.math. Another accessible reference is (Rucker 1982).)
Where $S$ is a set, the functions $f\colon S \to \mathbb{R}$ form a vector space under the natural operations: the sum $f+g$ is the function given by $(f+g)(s) = f(s) + g(s)$ and the scalar product is given by $(r \cdot f)(s) = r\,f(s)$. What is the dimension of the space resulting for each domain?
(See Problem 11.) Prove that this is an infinite-dimensional space: the set of all functions $f\colon \mathbb{R} \to \mathbb{R}$ under the natural operations.
(See Problem 11.) What is the dimension of the vector space of functions $f\colon S \to \mathbb{R}$, under the natural operations, where the domain $S$ is the empty set?
Show that any set of four vectors in $\mathbb{R}^3$ is linearly dependent.
Show that the set $\{\vec{v}_1, \vec{v}_2, \vec{v}_3\} \subset \mathbb{R}^3$ is a basis if and only if there is no plane through the origin containing all three vectors.
Prove that any subspace of a finite-dimensional space has a basis.
Prove that any subspace of a finite-dimensional space is finite-dimensional.
Prove that if $U$ and $W$ are both three-dimensional subspaces of $\mathbb{R}^5$ then $U \cap W$ is non-trivial. Generalize.
Because a basis for a space is a subset of that space, we are naturally led to how the property "is a basis" interacts with set operations.
Consider first how bases might be related by "subset". Assume that $U$ and $W$ are subspaces of some vector space and that $U \subseteq W$. Can there exist bases $B_U$ for $U$ and $B_W$ for $W$ such that $B_U \subseteq B_W$? Must such bases exist?
For any basis $B_U$ for $U$, must there be a basis $B_W$ for $W$ such that $B_U \subseteq B_W$?
For any basis $B_W$ for $W$, must there be a basis $B_U$ for $U$ such that $B_U \subseteq B_W$?
For any bases $B_U$ for $U$ and $B_W$ for $W$, must $B_U$ be a subset of $B_W$?
Is the intersection of bases a basis? For what space?
Is the union of bases a basis? For what space?
What about complement?
(Hint. Test any conjectures against some subspaces of $\mathbb{R}^3$.)
This exercise is recommended for all readers.
Consider how "dimension" interacts with "subset". Assume $U$ and $W$ are both subspaces of some vector space, and that $U \subseteq W$.
Prove that $\dim(U) \leq \dim(W)$.
Prove that equality of dimension holds if and only if $U = W$.
Show that the prior item does not hold if they are infinite-dimensional.
? Problem 21
For any vector $\vec{v} = (v_1, v_2, \dots, v_n)$ in $\mathbb{R}^n$ and any permutation $\sigma$ of the numbers $1$, $2$, ..., $n$ (that is, $\sigma$ is a rearrangement of those numbers into a new order), define $\sigma(\vec{v})$ to be the vector whose components are $v_{\sigma(1)}$, $v_{\sigma(2)}$, ..., and $v_{\sigma(n)}$ (where $\sigma(1)$ is the first number in the rearrangement, etc.). Now fix $\vec{v}$ and let $V$ be the span of $\{\sigma(\vec{v}) \mid \sigma \text{ is a permutation of } 1, \dots, n\}$. What are the possibilities for the dimension of $V$? (Gilbert, Krusemeyer & Larson 1993, Problem 47)