Linear Algebra/Vector Spaces And Subspaces
A vector space is a way of generalizing the concept of a set of vectors. For example, the complex number 2+3i can be considered a vector, since in some way it is the vector (2, 3).
The vector space is a "space" of such abstract objects, which we term "vectors".
Some familiar friends
Currently in our study of vectors we have looked at vectors with real entries: R2, R3, and so on. These are all vector spaces. The advantage we gain in abstracting to vector spaces is a way of talking about a space without any particular choice of objects (which define our vectors), operations (which act on our vectors), or coordinates (which identify our vectors in the space). Further results may then be applied to more general spaces, which may have infinite dimension, such as those studied in Functional Analysis.
Notations and concepts
We write a vector, as we have before, in bold, but on paper you should write vectors underlined or with an arrow on top. So we write v, in bold, for that vector.
When we multiply a vector by a scalar number, we usually denote the scalar by a Greek letter, writing λv for the multiplication of v by a scalar λ. We write addition and subtraction of vectors as we have before: x+y for the sum of vectors x and y.
With scalar multiplication and adding vectors, we can move to our definition of a vector space.
When we refer to an operation being 'closed' in a definition, we mean that the result of the operation stays within the set in question. For example, the set of all integers is closed under addition, because adding any two integers results in an integer. However, the set of integers is not closed under division, because dividing 3 by 2 (for example) does not result in an integer.
Definition
A vector space is a nonempty set V of objects, called vectors, on which two operations are defined, called vector addition and scalar multiplication, such that for all x, y in V and every scalar α, the sum x+y and the product αx are well-defined elements of V with the following properties (see the sketch after this list for a numerical spot-check):
- commutativity of addition: x+y=y+x
- associativity of addition: x+(y+z)=(x+y)+z
- additive identity: there is a vector 0 such that 0+x=x for all x
- additive inverse: for each vector x, there exists another vector y such that x+y =0
- scalar associativity: α(βx) = (αβ)x
- scalar distributivity: (α + β)x=αx+βx
- vector distributivity: α(x+y)=αx+αy
- scalar identity: 1x=x
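As a quick illustration of these eight axioms, here is a minimal numerical spot-check for R2 using numpy. The vectors and scalars are randomly chosen samples, so passing is evidence that R2 satisfies the axioms, not a proof.

```python
import numpy as np

# A minimal numerical spot-check of the eight axioms for R^2.
# Random samples only: evidence, not a proof.
rng = np.random.default_rng(0)
x, y, z = rng.random(2), rng.random(2), rng.random(2)
a, b = rng.random(), rng.random()

assert np.allclose(x + y, y + x)                # commutativity of addition
assert np.allclose(x + (y + z), (x + y) + z)    # associativity of addition
assert np.allclose(np.zeros(2) + x, x)          # additive identity
assert np.allclose(x + (-x), np.zeros(2))       # additive inverse
assert np.allclose(a * (b * x), (a * b) * x)    # scalar associativity
assert np.allclose((a + b) * x, a * x + b * x)  # scalar distributivity
assert np.allclose(a * (x + y), a * x + a * y)  # vector distributivity
assert np.allclose(1 * x, x)                    # scalar identity
print("all eight axioms hold for these samples")
```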
Alternative Definition
People who are familiar with group theory and field theory may find the following alternative definition more compact:
- $(V, +)$ is an Abelian group (this packages the first four properties above).
- Scalar multiplication is compatible with the field structure: α(βx) = (αβ)x, (α + β)x = αx + βx, α(x+y) = αx + αy, and 1x = x.
Subspaces
A subspace is a vector space inside a vector space: a subset of a vector space that is itself a vector space under the same operations. When we look at various vector spaces, it is often useful to examine their subspaces.
A subspace S of a vector space V is a subset of V that has the following key characteristics:
- S is closed under scalar multiplication: if λ ∈ R and v ∈ S, then λv ∈ S.
- S is closed under addition: if u, v ∈ S, then u+v ∈ S.
- S contains 0, the zero vector.
Any subset with these characteristics is a subspace.
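Here is a hedged sketch of how one might spot-check these three criteria numerically. The function name looks_like_subspace and the sample data are hypothetical, and a finite check is evidence rather than a proof.

```python
import numpy as np

def looks_like_subspace(member, samples, scalars):
    """Spot-check the three subspace criteria on sample vectors.

    `member` is a predicate deciding membership in the candidate subset.
    Passing is numerical evidence, not a proof.
    """
    if not member(np.zeros(2)):          # must contain the zero vector
        return False
    for u in samples:
        for lam in scalars:
            if not member(lam * u):      # closed under scalar multiplication
                return False
        for v in samples:
            if not member(u + v):        # closed under addition
                return False
    return True

# Example: the line y = 2x in R^2 (the set of all vectors (α, 2α)).
member = lambda w: np.isclose(w[1], 2 * w[0])
samples = [np.array([1.0, 2.0]), np.array([-3.0, -6.0])]
print(looks_like_subspace(member, samples, [0.0, -1.0, 2.5]))  # True
```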
Examples
Let us examine some subspaces of some familiar vector spaces, and see how we can prove that a certain subset of a vector space is in fact a subspace.
The trivial subspace
In R2, the set containing only the zero vector, {0}, is a subspace of R2.
Scalar multiplication closure: a 0=0 for all a in R
Addition closure: 0+0=0. Since 0 is the only member of the set, this is the only sum we need to check.
Zero vector: 0 is the only member of the set and it is the zero vector.
A slightly less trivial subspace
In R2, the set V of all vectors of the form (0, α), where α is in R, is a subspace.
Scalar multiplication closure: a (0,α) = (0,a α) and a α is in R
Addition closure: (0,α) +(0,β) =(0, α + β) and α + β is in R
Zero vector: taking α to be zero in our definition of (0, α) in V we get the zero vector (0,0)
A whole family of subspaces
Pick any number from R, say ρ. Then the set V of all vectors of the form (α, ρα) is a subspace of R2.
Scalar multiplication closure: a (α, ρα) = (aα, ρaα) which is in V.
Addition closure: (α, ρα) +(β, ρβ) =(α + β, ρα + ρβ) = (α+β, ρ(α+β)) which is in V
Zero vector: taking α to be zero in our definition we get (0, ρ0) = (0,0) in V.
That means V2 = the set of all vectors of the form (α,2α) is a subspace of R2
and V3 = the set of all vectors of the form (α,3α) is a subspace of R2
and V4 = the set of all vectors of the form (α,4α) is a subspace of R2
and V5 = the set of all vectors of the form (α,5α) is a subspace of R2
and Vπ = the set of all vectors of the form (α,πα) is a subspace of R2
and V√2 = the set of all vectors of the form (α, √2α) is a subspace of R2
As you can see, even a simple vector space like R2 can have many different subspaces.
Linear Combinations, Spans and Spanning Sets, Linear Dependence, and Linear Independence
Linear Combinations
Definition: Assume $V$ is a vector space over a field $F$ and $S$ is a nonempty subset of $V$. Then a vector $v \in V$ is said to be a linear combination of elements of $S$ if there exist a finite number of elements $u_1, \dots, u_n \in S$ and scalars $\alpha_1, \dots, \alpha_n \in F$ such that $v = \alpha_1 u_1 + \cdots + \alpha_n u_n$.
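For a concrete instance in R2:
$$(7, 5) = 2\,(2, 1) + 3\,(1, 1),$$
so (7, 5) is a linear combination of (2, 1) and (1, 1), with scalars 2 and 3.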
Spans
Definition: Assume $V$ is a vector space over a field $F$ and $S$ is a nonempty subset of $V$. The set of all linear combinations of elements of $S$ is called the span of $S$, also known as the linear closure of $S$. This is sometimes denoted by $\operatorname{span}(S)$.
Note that $\operatorname{span}(S)$ is a subspace of $V$.
Proof: Consider closure under addition and scalar multiplication for two vectors, x and y, in the span of $S$. Write $x = \alpha_1 u_1 + \cdots + \alpha_n u_n$ and $y = \beta_1 u_1 + \cdots + \beta_n u_n$ for some $u_1, \dots, u_n \in S$ (using zero coefficients where necessary so both combinations run over the same elements). Then
$$x + y = (\alpha_1 + \beta_1)u_1 + \cdots + (\alpha_n + \beta_n)u_n,$$
which is also contained in the set, and for any scalar $\lambda$,
$$\lambda x = (\lambda\alpha_1)u_1 + \cdots + (\lambda\alpha_n)u_n,$$
which is also contained in the set.
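As a computational aside (a sketch, not part of the proof): one way to test whether a concrete vector lies in the span of given vectors is to solve the corresponding linear system in the least-squares sense and check that the best fit reproduces the vector. The vectors below are hypothetical example data.

```python
import numpy as np

# Columns of S are the spanning vectors (hypothetical example data).
S = np.column_stack([np.array([1.0, 0.0, 2.0]),
                     np.array([0.0, 1.0, 1.0])])
b = np.array([2.0, 3.0, 7.0])   # candidate vector; here b = 2*s1 + 3*s2

# Solve S @ c = b in the least-squares sense; b lies in the span
# exactly when the best fit reproduces b.
c, _, _, _ = np.linalg.lstsq(S, b, rcond=None)
in_span = np.allclose(S @ c, b)
print(in_span, c)               # True [2. 3.]
```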
Spanning Sets
Definition: Assume $V$ is a vector space over a field $F$ and $v_1, \dots, v_n$ are vectors in $V$. The set $\{v_1, \dots, v_n\}$ is a spanning set for $V$ if and only if every vector in $V$ is a linear combination of $v_1, \dots, v_n$. Alternately, $\{v_1, \dots, v_n\}$ is a spanning set for $V$ if and only if $\operatorname{span}(\{v_1, \dots, v_n\}) = V$.
Linear Independence
Definition: Assume $V$ is a vector space over a field $F$ and $\{v_1, \dots, v_n\}$ is a finite subset of $V$. Then we say $\{v_1, \dots, v_n\}$ is linearly independent if $\alpha_1 v_1 + \cdots + \alpha_n v_n = 0$ implies $\alpha_1 = \cdots = \alpha_n = 0$. Linear independence is a very important topic in Linear Algebra. The definition implies that linearly dependent vectors can produce the null vector as a non-trivial combination, from which we may conclude that one of the vectors can be expressed as a linear combination of the others.
If we have a vector space V spanned by three vectors, we say that $v_1$, $v_2$, and $v_3$ are linearly dependent if a combination of one or two of them can produce the third. For instance, if one of the following equations:
$$v_1 = \alpha v_2 + \beta v_3, \qquad v_2 = \alpha v_1 + \beta v_3, \qquad v_3 = \alpha v_1 + \beta v_2$$
can be satisfied, then the vectors $v_1$, $v_2$, $v_3$ are said to be linearly dependent.
How can we test for linear independence? The definition sets it out for us. If V is a vector space spanned by three vectors $v_1$, $v_2$, $v_3$ of length N, and we want to test whether these three vectors are linearly independent, we form the equation
$$\alpha_1 v_1 + \alpha_2 v_2 + \alpha_3 v_3 = 0$$
and solve it. If the only solution is
$$\alpha_1 = \alpha_2 = \alpha_3 = 0,$$
then the three vectors are linearly independent. If there is another solution, they are linearly dependent.
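In computational practice this test amounts to a rank computation: the homogeneous system has only the zero solution exactly when the matrix whose columns are the vectors has full column rank. Below is a small sketch using numpy, with hypothetical example vectors.

```python
import numpy as np

# Three hypothetical vectors of length N = 3.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 3.0])  # note: v3 = v1 + v2

# alpha1*v1 + alpha2*v2 + alpha3*v3 = 0 has only the zero solution
# exactly when the matrix of columns has full column rank.
A = np.column_stack([v1, v2, v3])
independent = np.linalg.matrix_rank(A) == 3
print(independent)              # False, since v3 = v1 + v2
```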
We can restate this condition in matrix form. Collect the three vectors as the columns of a matrix $A = \begin{bmatrix} v_1 & v_2 & v_3 \end{bmatrix}$ and the coefficients as a vector $\alpha = (\alpha_1, \alpha_2, \alpha_3)^{\mathsf T}$. We can say that for the vectors to be linearly independent they must satisfy this condition:
$$A\alpha = 0 \quad \text{only for} \quad \alpha = 0,$$
where we are using 0 to denote the null vector. If $A$ is square and invertible, we can solve this equation directly:
$$\alpha = A^{-1}0 = 0.$$
And since $\alpha = 0$ is then the only solution, we know that the vectors are linearly independent. If, however, $A$ is not square, or if it is not invertible, we can try the following technique. Multiply through by the transpose matrix:
$$A^{\mathsf T}A\alpha = A^{\mathsf T}0 = 0.$$
If $A^{\mathsf T}A$ is invertible, multiply through by its inverse:
$$(A^{\mathsf T}A)^{-1}A^{\mathsf T}A\alpha = 0.$$
Cancel the terms:
$$\alpha = 0.$$
And our conclusion: $\alpha = 0$ is the only solution, so the vectors are again linearly independent.
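A minimal sketch of this transpose trick, again with hypothetical data: the matrix $A^{\mathsf T}A$ is square, and it is invertible (nonzero determinant) exactly when the columns of $A$ are linearly independent.

```python
import numpy as np

# A non-square example: three vectors of length 4 as the columns of A
# (hypothetical data; the third column equals the sum of the first two).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 3.0, 5.0],
              [1.0, 1.0, 2.0]])

gram = A.T @ A   # the square matrix A^T A from the text
# A^T A is invertible (nonzero determinant) exactly when the columns
# of A are linearly independent; the tolerance absorbs rounding error.
independent = abs(np.linalg.det(gram)) > 1e-10
print(independent)   # False: column 3 = column 1 + column 2
```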
Basis
A basis for a vector space is a smallest set of vectors that can be used to describe the vector space completely: a linearly independent set that spans the space. The most common basis vectors are the Kronecker vectors, also called the canonical basis:
$$e_1 = (1, 0, 0), \qquad e_2 = (0, 1, 0), \qquad e_3 = (0, 0, 1).$$
In the Cartesian graphing space, an ordered triple of coordinates (x, y, z) is defined in terms of these basis vectors, and we can make any point (x, y, z) by combining them:
$$(x, y, z) = x\,e_1 + y\,e_2 + z\,e_3.$$
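As a sketch of the reverse direction, we can recover the coordinates of a point in a basis by solving a linear system. With the canonical basis the coordinates are just the components themselves, so the basis below is a hypothetical non-canonical one chosen to make the step visible.

```python
import numpy as np

# Recover the coordinates of a point in a basis by solving B @ c = point,
# where the columns of B are the basis vectors (hypothetical basis).
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
point = np.array([4.0, -1.0, 2.0])

coords = np.linalg.solve(B, point)
print(coords, np.allclose(B @ coords, point))  # [ 7. -3.  2.] True
```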
Some theorems:
- A basis of a vector space V is a maximal set of linearly independent vectors.
- (Converse) A maximal set of linearly independent vectors in a vector space is a basis.
Bases and Dimension
If a vector space V is such that:
- it contains a linearly independent set B of N vectors, and
- any set of N + 1 or more vectors in V is linearly dependent,
then V is said to have dimension N, and B is said to be a basis of V.
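As a computational sketch (hypothetical data, using numpy): the dimension of the subspace spanned by a finite set of vectors equals the rank of the matrix whose columns are those vectors.

```python
import numpy as np

# The dimension of the span of a finite set of vectors equals the rank
# of the matrix whose columns are those vectors (hypothetical data).
vectors = np.array([[1.0, 0.0, 1.0],
                    [0.0, 1.0, 1.0],
                    [0.0, 0.0, 0.0]])

dim = np.linalg.matrix_rank(vectors)
print(dim)   # 2: the three columns span only a 2-dimensional subspace
```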
External links
- Online interactive exercises on vector spaces.
TODO
Explain what a basis is in a vector space and cover coordinate transformations. (This article contains an abstract definition of a basis, which is a generalization of a basis in a vector space and can be used as the foundation for explaining bases and coordinate transformations.)
Discuss the geometry of subspaces (points, lines, planes, hypersurfaces) and connect them to the geometry of solutions of linear systems. Connect the algebra of subspaces and linear combinations of vectors to the algebra of linear systems.
RECOMMENDATION: extend it to 4D