# Vector Spaces

A vector space generalizes the concept of a set of vectors. For example, the complex number 2+3i can be considered a vector, since in many ways it behaves like the vector ${\displaystyle {\begin{bmatrix}2\\3\end{bmatrix}}}$.

The vector space is a "space" of such abstract objects, which we term "vectors".

## Some familiar friends

So far in our study of vectors we have looked at vectors with real entries: ${\displaystyle \mathbb {R} ^{2},\mathbb {R} ^{3},...,\mathbb {R} ^{n}}$, and so on. These are all vector spaces. The advantage we gain by abstracting to vector spaces is a way of talking about a space without any particular choice of objects (which define our vectors), operations (which act on our vectors), or coordinates (which identify our vectors in the space). The results can then be applied to more general spaces, which may even have infinite dimension, as in functional analysis.

## Notations and concepts

As before, we write vectors in bold; on paper you should underline them or write them with an arrow on top. So we write ${\displaystyle \mathbf {v} ={\begin{pmatrix}2\\3\end{pmatrix}}}$ for that vector.

When we multiply a vector by a scalar, we usually denote the scalar by a Greek letter, writing λv for the multiplication of v by the scalar λ. We write addition and subtraction of vectors as before: x+y for the sum of vectors x and y.

With scalar multiplication and adding vectors, we can move to our definition of a vector space.

When we say an operation is 'closed' on a set, we mean that applying the operation to members of the set always produces another member of the set. For example, the set of all integers is closed under addition, because adding any two integers results in another integer. However, the set of integers is not closed under division, because dividing 3 by 2 (for example) does not result in an integer.
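The closure idea is easy to check mechanically. Here is a minimal Python sketch; the helper `closed_under` and the sample values are our own, purely for illustration:

```python
def closed_under(op, samples, in_set):
    """Return True when op(a, b) lands back in the set for every sample pair."""
    return all(in_set(op(a, b)) for a in samples for b in samples)

def is_integer(x):
    return x == int(x)

samples = [-3, -1, 0, 2, 3]

# Integers are closed under addition...
add_closed = closed_under(lambda a, b: a + b, samples, is_integer)
# ...but not under division: 3 / 2 = 1.5 is not an integer.
# (The b != 0 guard only avoids dividing by zero in the samples.)
div_closed = closed_under(lambda a, b: a / b if b != 0 else 0.0, samples, is_integer)
```

Sampling pairs like this can only refute closure, not prove it; a proof must argue over all members of the set.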

## Definition

A vector space is a nonempty set V of objects, called vectors, on which are defined two operations, called vector addition and scalar multiplication, such that, for ${\displaystyle x,y\in V}$ and α${\displaystyle \in F}$ (where F is a field), x+y and αx are well-defined elements of V with the following properties:

1. additive commutativity: x+y = y+x
2. additive associativity: (x+y)+z = x+(y+z)
3. additive identity: there is a vector 0 such that 0+x=x for all x
4. additive inverse: for each vector x, there exists another vector y such that x+y=0
5. scalar associativity: α(βx) = (αβ)x
6. scalar distributivity: (α + β)x = αx + βx
7. vector distributivity: α(x+y) = αx + αy
8. scalar identity: 1x=x
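The axioms can be verified directly for particular vectors of R2 with componentwise operations. A small Python sketch (the function names are our own, and checking particular vectors illustrates the axioms rather than proving them):

```python
# R^2 as ordered pairs with componentwise addition and scalar multiplication.

def add(x, y):
    """Componentwise vector addition in R^2."""
    return (x[0] + y[0], x[1] + y[1])

def scale(a, x):
    """Scalar multiplication in R^2."""
    return (a * x[0], a * x[1])

ZERO = (0.0, 0.0)
x, y, z = (2.0, 3.0), (-1.0, 4.0), (0.5, -2.0)
a, b = 2.0, 5.0

assert add(x, y) == add(y, x)                                # additive commutativity
assert add(add(x, y), z) == add(x, add(y, z))                # additive associativity
assert add(ZERO, x) == x                                     # additive identity
assert add(x, scale(-1.0, x)) == ZERO                        # additive inverse
assert scale(a, scale(b, x)) == scale(a * b, x)              # scalar associativity
assert scale(a + b, x) == add(scale(a, x), scale(b, x))      # scalar distributivity
assert scale(a, add(x, y)) == add(scale(a, x), scale(a, y))  # vector distributivity
assert scale(1.0, x) == x                                    # scalar identity
```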

### Alternative Definition

People who are familiar with group theory and field theory may find the following alternative definition more compact:

• ${\displaystyle (V,+)}$  is an Abelian group.
• ${\displaystyle \forall \alpha \in F,(\forall x,y\in V),\alpha (x+y)=\alpha x+\alpha y}$
• ${\displaystyle \forall \alpha ,\beta \in F,\forall x\in V,(\alpha +\beta )x=\alpha x+\beta x}$
• ${\displaystyle \forall \alpha ,\beta \in F,\forall x\in V,(\alpha \beta )x=\alpha (\beta x)}$
• ${\displaystyle \forall x\in V,1x=x}$

### Some Basic Theorems

1. The zero vector 0 is unique.
Proof: Let ${\displaystyle 0_{1}}$ and ${\displaystyle 0_{2}}$ both be zero vectors. Then ${\displaystyle 0_{1}=0_{1}+0_{2}=0_{2}}$.
2. The additive inverse is unique.
Proof: Suppose ${\displaystyle y_{1}}$ and ${\displaystyle y_{2}}$ are both additive inverses of x. Then ${\displaystyle y_{1}=y_{1}+0=y_{1}+(x+y_{2})=(y_{1}+x)+y_{2}=0+y_{2}=y_{2}}$.
3. 0x=0.
Proof: Let y be the additive inverse of x. Then 0x = 0x + x + y = (0+1)x + y = x + y = 0.
4. (-1)x is the additive inverse of x.
Proof: x+(-1)x = (1-1)x = 0x = 0.

## Linear Spaces

The linear space is a very important vector space. Let n1, n2, n3, ..., nk be k elements of a field F. Then the ordered k-tuples (n1, n2, n3, ..., nk) form a vector space, with addition being the componentwise sum of corresponding entries, and scalar multiplication by an element of F being the result of multiplying each entry of the k-tuple. This is the k-dimensional linear space over F.
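The componentwise operations on k-tuples can be sketched in Python; the field here is taken to be the real numbers, represented by floats, and the helper names are our own:

```python
# Componentwise operations on ordered k-tuples (here k = 4 over the reals).

def tuple_add(u, v):
    """Add two k-tuples entry by entry."""
    return tuple(a + b for a, b in zip(u, v))

def tuple_scale(c, u):
    """Multiply each entry of a k-tuple by the scalar c."""
    return tuple(c * a for a in u)

u = (1.0, 2.0, 3.0, 4.0)
v = (0.5, 0.5, 0.5, 0.5)

tuple_add(u, v)       # (1.5, 2.5, 3.5, 4.5)
tuple_scale(2.0, u)   # (2.0, 4.0, 6.0, 8.0)
```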

## Subspaces

A subspace is a vector space inside a vector space. When we look at various vector spaces, it is often useful to examine their subspaces.

A subspace S of a vector space V is a subset of V with the following key characteristics:

• S is nonempty; in particular it contains the zero vector
• S is closed under scalar multiplication: if λ∈R and v∈S, then λv∈S
• S is closed under addition: if u, v∈S, then u+v∈S.

Any subset with these characteristics is itself a vector space.

#### The trivial subspace

The singleton set containing the zero vector, {0}, is a subspace of every vector space.

Scalar multiplication closure: λ0 = 0 for all λ in R.

Addition closure: 0+0 = 0. Since 0 is the only member of the set, we need to check only 0.

Zero vector: 0 is the only member of the set, and it is the zero vector.

### Examples

Let us examine some subspaces of some familiar vector spaces, and see how we can prove that a certain subset of a vector space is in fact a subspace.

#### A slightly less trivial subspace

In R2, the set V of all vectors of the form (0,α), where α is in R, is a subspace.

Scalar multiplication closure: a (0,α) = (0,a α) and a α is in R

Addition closure: (0,α) +(0,β) =(0, α + β) and α + β is in R

Zero vector: taking α to be zero in our definition of (0, α) in V we get the zero vector (0,0)

#### A whole family of subspaces

Pick any number from R, say ρ. Then the set V of all vectors of the form (α, ρα) is a subspace of R2

Scalar multiplication closure: a (α, ρα) = (aα, ρaα) which is in V.

Addition closure: (α, ρα) +(β, ρβ) =(α + β, ρα + ρβ) = (α+β, ρ(α+β)) which is in V

Zero vector: taking α to be zero in our definition we get (0, ρ0) = (0,0) in V.

That means V2 = the set of all vectors of the form (α,2α) is a subspace of R2

and V3 = the set of all vectors of the form (α,3α) is a subspace of R2

and V4 = the set of all vectors of the form (α,4α) is a subspace of R2

and V5 = the set of all vectors of the form (α,5α) is a subspace of R2

and Vπ = the set of all vectors of the form (α,πα) is a subspace of R2

and V√2 = the set of all vectors of the form ${\displaystyle (\alpha ,{\sqrt {2}}\alpha )}$  is a subspace of R2

As you can see, even a simple vector space like R2 can have many different subspaces.
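The closure arguments for these subspaces can be spot-checked numerically. A Python sketch for ρ = 2, sampling a few vectors rather than proving the general statement (the helper names are our own):

```python
# Spot-check closure of V_rho = {(a, rho*a)} in R^2, here with rho = 2.

RHO = 2.0

def in_V(v):
    """Membership test for V_rho: second entry is rho times the first."""
    return v[1] == RHO * v[0]

samples = [(a, RHO * a) for a in (-1.5, 0.0, 2.0, 3.0)]

# Scalar multiplication closure on the samples.
scalar_ok = all(in_V((c * v[0], c * v[1])) for c in (-2.0, 0.5, 3.0) for v in samples)
# Addition closure on the samples.
add_ok = all(in_V((u[0] + v[0], u[1] + v[1])) for u in samples for v in samples)
# The zero vector belongs to V_rho.
zero_ok = in_V((0.0, 0.0))
```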

## Linear Combinations, Spans and Spanning Sets, Linear Dependence, and Linear Independence

### Linear Combinations

Definition: Assume ${\displaystyle V}$ is a vector space over a field ${\displaystyle (F,+,\cdot )}$ and ${\displaystyle S}$ is a nonempty subset of ${\displaystyle V}$. Then a vector ${\displaystyle x\in V}$ is said to be a linear combination of elements of ${\displaystyle S}$ if there exist finitely many elements ${\displaystyle y_{1},y_{2},...,y_{n}\in S}$ and scalars ${\displaystyle a_{1},a_{2},...,a_{n}\in F}$ such that ${\displaystyle x=a_{1}y_{1}+a_{2}y_{2}+...+a_{n}y_{n}}$.

### Spans

Definition: Assume ${\displaystyle V}$  is a vector space over a field ${\displaystyle (F,+,\cdot )}$ . The set of all linear combinations of ${\displaystyle y_{1},y_{2},...,y_{n}\in V}$  is called the span of ${\displaystyle y_{1},y_{2},...,y_{n}}$ . This is sometimes denoted by ${\displaystyle Span(y_{1},y_{2},...,y_{n})}$ .

Note that ${\displaystyle Span(y_{1},y_{2},...,y_{n})}$ is a subspace of ${\displaystyle V}$.

Proof: Consider closure under addition and scalar multiplication for two vectors, x and y, in the span of the vectors ${\displaystyle {v_{1},v_{2},...,v_{n}}.}$

${\displaystyle x=a_{1}v_{1}+a_{2}v_{2}+...+a_{n}v_{n}}$

${\displaystyle y=b_{1}v_{1}+b_{2}v_{2}+...+b_{n}v_{n}}$

${\displaystyle x+y=(a_{1}+b_{1})v_{1}+(a_{2}+b_{2})v_{2}+...+(a_{n}+b_{n})v_{n}}$, which is also contained in the set.

${\displaystyle kx=(ka_{1})v_{1}+(ka_{2})v_{2}+...+(ka_{n})v_{n}}$, which is also contained in the set.

#### Spanning Sets

Definition: Assume ${\displaystyle V}$  is a vector space over a field ${\displaystyle (F,+,\cdot )}$  and ${\displaystyle y_{1},y_{2},...,y_{n}}$  are vectors in such a vector space. The set ${\displaystyle {y_{1},y_{2},...,y_{n}}}$  is a spanning set for the vector space ${\displaystyle V}$  if and only if every vector in ${\displaystyle V}$  is a linear combination of ${\displaystyle y_{1},y_{2},...,y_{n}}$ . Alternately, ${\displaystyle \forall x\in V,(\exists a_{1},a_{2},...,a_{n}\in F),x=a_{1}y_{1}+a_{2}y_{2}+...+a_{n}y_{n}}$
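For R2 the spanning condition can be checked by solving the 2×2 system directly. A Python sketch using Cramer's rule, with illustrative vectors of our own choosing:

```python
# Do y1 and y2 span R^2?  Solve a1*y1 + a2*y2 = x by Cramer's rule;
# a nonzero determinant means every x in R^2 is reachable.

def coefficients(y1, y2, x):
    """Return (a1, a2) with a1*y1 + a2*y2 = x, or None if y1, y2 do not span R^2."""
    det = y1[0] * y2[1] - y1[1] * y2[0]
    if det == 0:
        return None
    a1 = (x[0] * y2[1] - x[1] * y2[0]) / det
    a2 = (y1[0] * x[1] - y1[1] * x[0]) / det
    return a1, a2

coefficients((1.0, 0.0), (1.0, 1.0), (5.0, 3.0))   # (2.0, 3.0): spanning set
coefficients((1.0, 2.0), (2.0, 4.0), (1.0, 0.0))   # None: collinear, not spanning
```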

### Linear Independence

Definition: Assume ${\displaystyle V}$ is a vector space over a field ${\displaystyle (F,+,\cdot )}$ and ${\displaystyle S=\{x_{1},x_{2},...,x_{n}\}}$ is a finite subset of ${\displaystyle V}$. Then we say ${\displaystyle S}$ is linearly independent if ${\displaystyle a_{1}x_{1}+a_{2}x_{2}+...+a_{n}x_{n}=0}$ implies ${\displaystyle a_{1}=a_{2}=...=a_{n}=0}$.

Linear independence is a very important topic in linear algebra. The definition implies that linearly dependent vectors can form the null vector as a non-trivial combination, from which we may conclude that one of the vectors can be expressed as a linear combination of the others.

If a vector space V is spanned by 3 vectors ${\displaystyle v_{1},v_{2},v_{3}}$, we say that v1, v2, and v3 are linearly dependent if one of them can be produced as a combination of the others. For instance, if any one of the following equations:

${\displaystyle a_{1}v_{1}+a_{2}v_{2}=v_{3}}$
${\displaystyle a_{2}v_{2}+a_{3}v_{3}=v_{1}}$
${\displaystyle a_{1}v_{1}+a_{3}v_{3}=v_{2}}$

can be satisfied, then the vectors are said to be linearly dependent.

How can we test for linear independence? The definition shows us how. Suppose V is a vector space spanned by 3 vectors of length N, which we collect as the columns of a matrix:

${\displaystyle {\tilde {V}}=[v_{1},v_{2},v_{3}]}$

To test whether these 3 vectors are linearly independent, we form the equation:

${\displaystyle a_{1}v_{1}+a_{2}v_{2}+a_{3}v_{3}=0\,}$

and solve them. If the only solution is

${\displaystyle a_{1}=a_{2}=a_{3}=0,\,}$

then the 3 vectors are linearly independent. If there is another solution they are linearly dependent.
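This test can be carried out mechanically by Gaussian elimination: the vectors are linearly independent exactly when the matrix formed from them has full rank. A pure-Python sketch (the function names are our own):

```python
# Rank via Gaussian elimination; n vectors are linearly independent
# iff the matrix with those vectors as rows has rank n (row rank
# equals column rank, so rows are as good as columns here).

def rank(rows):
    """Row-reduce a copy of the matrix and count the pivots."""
    m = [row[:] for row in rows]
    r = 0
    for col in range(len(m[0])):
        # Find a pivot at or below row r in this column (with a small tolerance).
        pivot = next((i for i in range(r, len(m)) if abs(m[i][col]) > 1e-12), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        # Eliminate this column from every other row.
        for i in range(len(m)):
            if i != r and abs(m[i][col]) > 1e-12:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def independent(vectors):
    """True when the given vectors are linearly independent."""
    return rank([list(v) for v in vectors]) == len(vectors)

independent([(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)])  # True
independent([(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (1.0, 1.0, 0.0)])  # False: v3 = v1 + v2
```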

Equivalently, in matrix form, the 3 vectors are linearly independent exactly when ${\displaystyle {\bar {a}}=0}$ is the only solution of:

${\displaystyle {\tilde {V}}{\bar {a}}=0}$

Where we are using 0 to denote the null vector. If ${\displaystyle {\tilde {V}}}$ is square and invertible, we can solve this equation directly:

${\displaystyle {\tilde {V}}^{-1}{\tilde {V}}{\bar {a}}={\bar {a}}={\tilde {V}}^{-1}\cdot 0}$
${\displaystyle {\bar {a}}=0}$

And if we know that ${\displaystyle {\bar {a}}}$ must be zero, then we know that the system is linearly independent. If, however, ${\displaystyle {\tilde {V}}}$ is not square, or if it is not invertible, we can try the following technique:

Multiply through by the transpose matrix:

${\displaystyle {\tilde {V}}^{T}{\tilde {V}}{\bar {a}}=0}$

Find the inverse of ${\displaystyle [{\tilde {V}}^{T}{\tilde {V}}]}$ (this inverse exists precisely when the columns of ${\displaystyle {\tilde {V}}}$ are linearly independent), and multiply through by the inverse:

${\displaystyle [{\tilde {V}}^{T}{\tilde {V}}]^{-1}{\tilde {V}}^{T}{\tilde {V}}{\bar {a}}=[{\tilde {V}}^{T}{\tilde {V}}]^{-1}\cdot 0}$

Cancel the terms:

${\displaystyle {\bar {a}}=[{\tilde {V}}^{T}{\tilde {V}}]^{-1}\cdot 0}$

And our conclusion:

${\displaystyle {\bar {a}}=0}$

This again means that the columns of ${\displaystyle {\tilde {V}}}$ are linearly independent.
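For a non-square ${\displaystyle {\tilde {V}}}$ this amounts to checking whether the Gram matrix ${\displaystyle {\tilde {V}}^{T}{\tilde {V}}}$ is invertible. A Python sketch for two column vectors in R3, where the Gram matrix is 2×2 and invertibility reduces to a nonzero determinant (the helper names are our own):

```python
# Gram-matrix test for two column vectors in R^3: V^T V is 2x2, and its
# determinant is nonzero exactly when the columns are independent.

def dot(u, v):
    """Euclidean dot product."""
    return sum(a * b for a, b in zip(u, v))

def gram_det(v1, v2):
    """Determinant of the 2x2 Gram matrix V^T V for columns v1, v2."""
    # V^T V = [[v1.v1, v1.v2], [v2.v1, v2.v2]]
    return dot(v1, v1) * dot(v2, v2) - dot(v1, v2) ** 2

gram_det((1.0, 0.0, 1.0), (0.0, 1.0, 1.0))   # 3.0 -> independent
gram_det((1.0, 2.0, 3.0), (2.0, 4.0, 6.0))   # 0.0 -> dependent (v2 = 2*v1)
```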

### Span

The span of a set of vectors is the set of all linear combinations of those vectors, as defined above; a set of vectors spans V when its span is all of V.

### Basis

A basis for a vector space is a smallest set of linearly independent vectors that describes the vector space completely. The most common basis vectors are the Kronecker vectors, also called the canonical basis:

${\displaystyle i={\begin{bmatrix}1\\0\\0\end{bmatrix}}}$ ${\displaystyle j={\begin{bmatrix}0\\1\\0\end{bmatrix}}}$ ${\displaystyle k={\begin{bmatrix}0\\0\\1\end{bmatrix}}}$

In the Cartesian graphing space, an ordered triple of coordinates defines the vector:

${\displaystyle v={\begin{bmatrix}x\\y\\z\end{bmatrix}}}$

And we can make any point (x, y, z) by combining the Kronecker basis vectors:

${\displaystyle {\begin{bmatrix}x\\y\\z\end{bmatrix}}=xi+yj+zk}$

Some theorems:

• A basis ${\displaystyle {\mathcal {B}}=(v_{1},\ldots ,v_{n})}$ of a vector space V is a maximal set of linearly independent vectors in V.
• (Converse) A maximal set of linearly independent vectors in a vector space is a basis.

## Bases and Dimension

If a vector space V is such that:

• it contains a linearly independent set B of N vectors, and
• any set of N + 1 or more vectors in V is linearly dependent,

then V is said to have dimension N, and B is said to be a basis of V.