Linear Algebra/Subspaces

Suppose V is a vector space over a field F. A nonempty subset H of V is defined to be a subspace of V when:

  1. If u and v are elements of H, then u+v is also an element of H.
  2. If u is an element of H and c is an element of F, then cu is also an element of H.

Here addition and scalar multiplication in H are the restrictions of the corresponding operations of V; that is, vectors in H are added and scaled exactly as they are in V.
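
As a quick concrete check (an illustrative example added here, not part of the original definition), the line through the origin H = {(t, 2t) : t in R} satisfies both conditions inside R^2:

```latex
% Requires amsmath. Checking both subspace conditions for
% H = { (t, 2t) : t in R } inside R^2.
\begin{align*}
  (s, 2s) + (t, 2t) &= (s + t,\, 2(s + t)) \in H
    && \text{closed under addition} \\
  c\,(t, 2t) &= (ct,\, 2(ct)) \in H
    && \text{closed under scalar multiplication}
\end{align*}
```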

Every subspace of a vector space is itself a vector space. Let x, y, and z be elements of H and let c and d be elements of F. The identities x+y = y+x, (x+y)+z = x+(y+z), 1x = x, c(dx) = (cd)x, (c+d)x = cx+dx, and c(x+y) = cx+cy all hold automatically, because x, y, and z are elements of V, where these axioms are already satisfied. The remaining two axioms assert the existence of certain elements. By the second condition, 0x is an element of H for any x in H, and 0x is the zero vector, so the identity lies in the subspace. Likewise, (-1)x is an element of H, and (-1)x = -x, so the additive inverse of each element lies in the subspace. Thus, all subspaces of a vector space are also vector spaces.
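
The two identities used in the existence arguments, 0x = 0 and (-1)x = -x, follow from the axioms; a short derivation (spelled out here for completeness) is:

```latex
% Requires amsmath. Deriving the two identities used above
% from the vector space axioms.
\begin{align*}
  0\,x &= (0 + 0)\,x = 0\,x + 0\,x
    && \Rightarrow\ 0\,x = \mathbf{0} \\
  x + (-1)\,x &= \bigl(1 + (-1)\bigr)\,x = 0\,x = \mathbf{0}
    && \Rightarrow\ (-1)\,x = -x
\end{align*}
```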

Properties of Subspaces

If a set of vectors lies in a subspace H of a vector space V, then it is linearly independent in H exactly when it is linearly independent in V, since the linear combinations involved are the same in both spaces. In particular, any basis of H is a linearly independent set in V, so the dimension of H is less than or equal to the dimension of V.
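
For instance (a standard example, added for illustration), the coordinate plane spanned by the first two standard basis vectors of R^3 is a subspace of strictly smaller dimension:

```latex
% A two-dimensional subspace of the three-dimensional space R^3
\[
  H = \operatorname{span}\{e_1, e_2\} \subseteq \mathbb{R}^3,
  \qquad \dim H = 2 \le 3 = \dim \mathbb{R}^3
\]
```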

Also, for every basis of a subspace H, there exists a basis of V which contains that basis of H. This follows from the completion theorem proven earlier.
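
For example (an added illustration), a basis of the plane H = span{e1, e2} in R^3 completes to a basis of R^3 by adjoining the third standard basis vector:

```latex
% Completing a basis B of the plane H = span{e_1, e_2}
% to a basis C of R^3 by adjoining one vector
\[
  B = \{e_1, e_2\} \subseteq C = \{e_1, e_2, e_3\},
  \qquad C \text{ is a basis of } \mathbb{R}^3
\]
```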

Dimension

A set of vectors v1, v2, v3, ..., vn in a vector space V over a field F is said to be linearly independent over a subspace H when, for elements a1, a2, a3, ..., an of the field F, a1v1+a2v2+a3v3+...+anvn is an element of H only when a1, a2, a3, ..., an all equal 0. Linear independence over the subspace containing only the 0 vector is the same as ordinary linear independence, since a linear combination lies in that subspace exactly when it equals 0. The maximum number of vectors in V which are linearly independent over H is defined to be the dimension of V over H.
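
As an added example, take V = R^3 and H = span{e3}; then e1 and e2 are linearly independent over H:

```latex
% In R^3 with H = span{e_3}: e_1, e_2 are linearly
% independent over H, so the dimension of R^3 over H is 2.
\[
  a_1 e_1 + a_2 e_2 = (a_1, a_2, 0) \in \operatorname{span}\{e_3\}
  \iff a_1 = a_2 = 0
\]
```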

If a set of vectors O in V is linearly independent over H and a set of vectors I is linearly independent within H, then the union of those two sets is also linearly independent. This is because a linear combination of those vectors can equal 0 only when the linear combination of the vectors in O is the opposite of the linear combination of the vectors of I, which lies within H; this implies that the coefficients of the elements of O are all 0, which in turn forces the linear combination of the elements of I to equal 0, so its coefficients are all 0 as well.
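
In symbols (a restatement of the argument above, with hypothetical names u_j for the vectors of O and w_j for the vectors of I):

```latex
% Suppose a combination of vectors u_j from O and w_j from I is 0.
% Rearranging puts the O-part into H, forcing all coefficients to 0.
\[
  \sum_{j=1}^{m} a_j u_j = -\sum_{j=1}^{k} b_j w_j \in H
  \;\Rightarrow\; a_1 = \dots = a_m = 0
  \;\Rightarrow\; \sum_{j=1}^{k} b_j w_j = \mathbf{0}
  \;\Rightarrow\; b_1 = \dots = b_k = 0
\]
```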

If, given any subspace H of a vector space V, one has a basis B for H and a basis C of V containing B, then the elements of C-B are linearly independent over H. Indeed, suppose a linear combination of the elements of C-B is an element of H. Since B is a basis of H, that combination also equals a linear combination of the elements of B; subtracting gives a linear combination of the elements of C equal to 0, and since C is linearly independent, all of its coefficients are 0. In particular, any linear combination of the elements of C-B in which not all coefficients are 0 is not an element of H, so the elements of C-B are linearly independent over H. Conversely, by the previous result, any set of vectors linearly independent over H remains linearly independent when joined with B, so it can contain no more than dim V - dim H vectors. Thus, if a vector space has dimension d and a subspace has dimension s, the dimension of the vector space over that subspace is d-s.
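
As an added numerical check, take V = R^3 with the plane H = span{e1, e2}, B = {e1, e2}, and C = {e1, e2, e3}:

```latex
% With V = R^3 (d = 3) and H = span{e_1, e_2} (s = 2):
% C - B = {e_3} is linearly independent over H, and d - s = 1.
\[
  a\,e_3 = (0, 0, a) \in H \iff a = 0,
  \qquad d - s = 3 - 2 = 1
\]
```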