# Real Analysis/Vectors

In mathematics, you have likely grown used to variables that represent functions, sets, or numbers. We have rigorously defined how variables are assigned to functions and, to a lesser extent, sets, along with the operations one can perform on them, but we have not yet focused on how variables represent numbers. Usually we imagine a variable as holding a single value (the plus-minus sign being the first notation to openly question this idea), and moreover as holding only one element. Let us ask: what if a variable could hold more than one number, and we could operate on that? You arrive here, at the topic of vectors.

## Definition

**Definition of a Vector**

An ordered collection of numbers.

Let's suppose that you want to create a new system for these vectors. Given a variable $\vec{v}$, you may first decide on some new notation to indicate that this variable is a vector, as opposed to a function or a number.

$$\vec{v} = (v_1, v_2) \quad \text{or} \quad \vec{w} = (w_1, w_2)$$

Vectors, not being sets, are written with round brackets, and each slot inside the brackets holds a number. The first thing to note is that this notation clashes with Euclidean graphing coordinates, which are also written as numbers inside round brackets. This is true! As you will see later on this page, there is a reason for this shared notation.

## Operations

First, we must define some operations for this new object, lest the definition become useless. Luckily, the standard operation on vectors is addition, which is nice and simple. We define vector addition between two vectors $\vec{v}$ and $\vec{w}$ in this manner:

$$\vec{v} + \vec{w} = (v_1 + w_1, v_2 + w_2).$$

Why this definition? When creating new definitions, we are free to choose whatever we want. Admittedly, this definition is not something exotic like $\vec{v} + \vec{w} = ((v_1 + w_2)/w_1, (v_2 + w_1)/w_2)$, but the simple choice already carries the familiar properties we commonly associate with addition.
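As an informal illustration (not part of the formal development), componentwise addition can be sketched in a few lines of Python; the function name `vector_add` and the use of tuples for vectors are choices made here for the example:

```python
def vector_add(v, w):
    """Add two 2-component vectors componentwise: (v1 + w1, v2 + w2)."""
    return (v[0] + w[0], v[1] + w[1])

v = (1, 2)
w = (3, 4)
print(vector_add(v, w))  # (4, 6)
```

Note that the tuple `(4, 6)` is exactly what the definition prescribes: each slot is the sum of the corresponding slots of the inputs.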

The complete list of these properties is outlined below:

| Property | Statement |
| --- | --- |
| Associative | $\vec{u} + (\vec{v} + \vec{w}) = (\vec{u} + \vec{v}) + \vec{w}$ |
| Commutative | $\vec{v} + \vec{w} = \vec{w} + \vec{v}$ |
| Identity | $\exists\,\vec{0} : \vec{v} + \vec{0} = \vec{v}$ |
| Inverse | $\exists\,\vec{w} : \vec{v} + \vec{w} = \vec{0}$ |

Luckily, because the definition of a vector relies on arithmetic that has already been developed, our definition of vector addition requires no new axioms! Let's prove each property.
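Before the proofs, the four properties can be spot-checked numerically for sample vectors. This is only a sanity check on particular values, not a proof; the helper `vector_add` and the sample vectors are chosen here for illustration:

```python
def vector_add(v, w):
    """Componentwise addition of 2-component vectors."""
    return (v[0] + w[0], v[1] + w[1])

u, v, w = (1, 2), (3, 4), (5, 6)
zero = (0, 0)
neg_v = (-v[0], -v[1])  # candidate inverse of v

# Spot-check each property on these sample vectors.
assert vector_add(u, vector_add(v, w)) == vector_add(vector_add(u, v), w)  # associative
assert vector_add(v, w) == vector_add(w, v)                                # commutative
assert vector_add(v, zero) == v                                            # identity
assert vector_add(v, neg_v) == zero                                        # inverse
print("all four properties hold for these samples")
```

Of course, passing for one choice of $\vec{u}, \vec{v}, \vec{w}$ shows nothing in general; the proofs below establish the properties for all vectors.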

### Proof

The first proof for vector addition is the property of associativity. This straightforward proof relies on the associativity of ordinary addition.

First, we apply the definition of vector addition to the left-hand side of the equation, twice. Since addition of numbers is already associative, we can regroup the components and then apply the definition in reverse, i.e. work backwards:

$$\begin{aligned}
\vec{u} + (\vec{v} + \vec{w}) &= (u_1, u_2) + (v_1 + w_1, v_2 + w_2) \\
&= (u_1 + [v_1 + w_1], u_2 + [v_2 + w_2]) \\
&= ([u_1 + v_1] + w_1, [u_2 + v_2] + w_2) \\
&= (u_1 + v_1, u_2 + v_2) + (w_1, w_2) \\
&= (\vec{u} + \vec{v}) + \vec{w}
\end{aligned}$$

$\blacksquare$

For commutativity, the proof is similar to the previous one.

First, we apply the definition of vector addition to the left-hand side of the equation. Since addition of numbers is already commutative, we can swap the components and apply the definition in reverse:

$$\begin{aligned}
\vec{u} + \vec{v} &= (u_1 + v_1, u_2 + v_2) \\
&= (v_1 + u_1, v_2 + u_2) \\
&= \vec{v} + \vec{u}
\end{aligned}$$

$\blacksquare$

For the identity, the proof is similar to the previous one.

First, we start from the left-hand side. Since addition of numbers already has an identity, $0$, we can insert it into each component and then apply the definition of vector addition in reverse, writing $\vec{0} = (0, 0)$:

$$\begin{aligned}
\vec{v} &= (v_1, v_2) \\
&= (v_1 + 0, v_2 + 0) \\
&= (v_1, v_2) + (0, 0) \\
&= \vec{v} + \vec{0}
\end{aligned}$$

$\blacksquare$