## Section 2.1 What is a Vector Space

### Subsection 2.1.1 Something Old, Something New

Long ago and far away you learned to plot points in two dimensions with \(x\) and \(y\) coordinates.

But how can we understand this in terms of vectors, and how can we most efficiently describe infinitely many points and vectors?

The simplest thing to do would be to give each point its own vector, like \(\vec{v}=\left[ 2,3\right]\text{,}\) but with infinitely many points to choose from this isn't very efficient. A little better is to recognize that \(\vec{v}\) can be written as a sum of two vectors like so:

\[
\vec{v}=\left[ 2,3\right]=\left[ 2,0\right]+\left[ 0,3\right]\text{.}
\]

Best of all is to realize that this is a linear combination of our elementary vectors:

\[
\vec{v}=\left[ 2,3\right]=2\left[ 1,0\right]+3\left[ 0,1\right]=2\vec{e}_1+3\vec{e}_2\text{.}
\]

Use what you learned in Section 1.4 to recreate the image above in the Sage Cell below. Then adjust the Sage code in the cell to create similar pictures for the points \((1,3)\text{,}\) \((-2,1)\text{,}\) and \((-3,-2)\text{.}\) Are you convinced that you can always make this sort of picture? Can you see that this means you can always write any vector as a combination of copies of the elementary vectors?
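If you want to check the arithmetic behind these pictures without drawing them, here is a minimal sketch in plain Python (so it also runs in a Sage cell); the function name `decompose` is my own, not from the text. It confirms that each listed point is \(x\) copies of \(\vec{e}_1\) plus \(y\) copies of \(\vec{e}_2\):

```python
# Check that every point (x, y) is x*e1 + y*e2, with e1 = [1, 0]
# and e2 = [0, 1]. (Arithmetic only; the pictures need the Sage cell.)
def decompose(point):
    """Rebuild a point as a combination of the elementary vectors."""
    e1, e2 = (1, 0), (0, 1)
    x, y = point
    return (x * e1[0] + y * e2[0], x * e1[1] + y * e2[1])

for p in [(2, 3), (1, 3), (-2, 1), (-3, -2)]:
    assert decompose(p) == p  # every point comes back unchanged
print("all points decompose correctly")
```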

Because we can always write any vector (or describe steps to any point) using copies of the elementary vectors \(\vec{e}_1=[1,0]\) and \(\vec{e}_2=[0,1]\) we say that they form a basis, which we will explore in detail in the next section. For now what this tells us is that we can describe all the points in \(\mathbb{R}^2\) as the set

\[
\mathbb{R}^2=\left\{ a\vec{e}_1+b\vec{e}_2 \,:\, a,b\in\mathbb{R}\right\}\text{.}
\]

### Subsection 2.1.2 It's Not All About the Arrows

Consider the set of all quadratic polynomials

\[
\left\{ ax^2+bx+c \,:\, a,b,c\in\mathbb{R}\right\}\text{,}
\]

which you should hopefully recognize as all the possible parabolas. If we take two of these, \(v(x)=2x^2+3x\) and \(u(x)=-x^2+5\text{,}\) we can add and subtract them

\[
v(x)+u(x)=x^2+3x+5\text{,}\qquad v(x)-u(x)=3x^2+3x-5\text{,}
\]

we can multiply them by constants, for example

\[
3\,v(x)=6x^2+9x\text{,}\qquad -2\,u(x)=2x^2-10\text{,}
\]

and in fact we can take any linear combination we would like

\[
a\,v(x)+b\,u(x)=a(2x^2+3x)+b(-x^2+5)\text{.}
\]

All of this means that we can manipulate these polynomials just like we do vectors. In fact if you look back at Subsection 1.2.2 you will see that the coefficients we used for \(u(x)\) and \(v(x)\) are the same as the entries we had in \(\vec{u}\) and \(\vec{v}\text{.}\)
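One way to see this correspondence is to represent a quadratic \(ax^2+bx+c\) by its coefficient triple \((a,b,c)\), mirroring the vectors in Subsection 1.2.2. A sketch in plain Python (also runnable in a Sage cell; the helper names `add` and `scale` are my own):

```python
# Represent a*x^2 + b*x + c by the tuple (a, b, c); then polynomial
# addition and scalar multiplication are just vector operations.
def add(p, q):
    return tuple(pi + qi for pi, qi in zip(p, q))

def scale(c, p):
    return tuple(c * pi for pi in p)

v = (2, 3, 0)   # v(x) = 2x^2 + 3x
u = (-1, 0, 5)  # u(x) = -x^2 + 5

print(add(v, u))     # (1, 3, 5), i.e. x^2 + 3x + 5
print(scale(3, v))   # (6, 9, 0), i.e. 6x^2 + 9x
print(add(scale(7, v), scale(-5, u)))  # (19, 21, -25), i.e. 7v - 5u
```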

We can take the analogy even further by setting up systems of equations involving polynomials. For example, can we find scalars \(a\) and \(b\) such that

\[
a\,v(x)+b\,u(x)=19x^2+21x-25\text{?}
\]

Expanding the left-hand side we get

\[
a(2x^2+3x)+b(-x^2+5)=(2a-b)x^2+3a\,x+5b\text{,}
\]

which gives us three equations when we compare coefficients:

\[
2a-b=19\text{,}\qquad 3a=21\text{,}\qquad 5b=-25\text{.}
\]

By inspection (i.e. just looking at it for a minute) we can see that \(a=7\) and \(b=-5\text{.}\)
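To double-check the inspection, a short plain-Python snippet (also runnable in a Sage cell) can substitute \(a=7\) and \(b=-5\) back into the three coefficient equations; the target coefficients \(19\text{,}\) \(21\text{,}\) \(-25\) are those of \(7\,v(x)-5\,u(x)\):

```python
# Substitute the claimed solution back into the coefficient equations
# for a*v(x) + b*u(x), with v(x) = 2x^2 + 3x and u(x) = -x^2 + 5.
a, b = 7, -5
assert 2*a - b == 19   # x^2 coefficients: 2a - b
assert 3*a == 21       # x coefficients: 3a
assert 5*b == -25      # constant terms: 5b
print("a = 7, b = -5 checks out")
```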

Finally, one more observation. If we let \(e_2(x)=x^2\text{,}\) \(e_1(x)=x\text{,}\) and \(e_0(x)=1\text{,}\) then we can write every possible quadratic polynomial (every parabola) as a combination of \(e_2\text{,}\) \(e_1\text{,}\) and \(e_0\text{.}\) This means that we can define a set of elementary polynomials which we can combine to give us all the rest. We can form a basis of polynomials.

So, these quadratic polynomials can be added like vectors, they can be subtracted like vectors, they can be multiplied by scalars like vectors, they can be put together in linear combinations like vectors, and we can solve systems of equations with them just like vectors. Therefore, they are in a very real sense vectors.

### Subsection 2.1.3 It's Kinda All the Same

###### Definition 2.1.3.

A vector space is a pair of sets, vectors and scalars, together with a pair of binary operations, addition and scalar multiplication, that satisfy the following conditions: if \(v\text{,}\) \(u\text{,}\) and \(w\) are vectors and \(a\) and \(b\) are scalars, then

- \(a\,u +b\, v\) is another vector (closure)
- there exists a vector 0 such that \(v+0=0+v=v\) (additive identity)
- for each \(v\) there exists \(-v\) such that \(v+(-v)=(-v)+v=0\) (additive inverses)
- there exists a scalar 1 such that \(1\, v=v\) (multiplicative identity)
- \(v+u=u+v\) (commutative law)
- \((v+u)+w=v+(u+w)\) and \((ab)v=a(bv)\) (associative laws)
- \(a(u+v)=au+av\) and \((a+b)v=av+bv\) (distributive laws)
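These axioms can be spot-checked numerically for \(\mathbb{R}^2\) with a few lines of plain Python (also runnable in a Sage cell); this is an illustration for particular vectors and scalars, not a proof, and the helper names `vadd` and `smul` are my own:

```python
# Spot-check several vector-space axioms for R^2, with vectors as
# 2-tuples, addition and scalar multiplication defined entrywise.
def vadd(u, v):
    return (u[0] + v[0], u[1] + v[1])

def smul(a, v):
    return (a * v[0], a * v[1])

u, v, w = (2, 3), (-1, 5), (4, -2)
a, b = 3, -4

assert vadd(v, u) == vadd(u, v)                              # commutative law
assert vadd(vadd(v, u), w) == vadd(v, vadd(u, w))            # associative law
assert smul(a, vadd(u, v)) == vadd(smul(a, u), smul(a, v))   # distributive law
assert smul(a + b, v) == vadd(smul(a, v), smul(b, v))        # distributive law
assert vadd(v, smul(-1, v)) == (0, 0)                        # additive inverse
print("axioms hold for these choices")
```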

As observed above the coefficients in \(v(x)=2x^2+3x\) and \(u(x)=-x^2+5\) are the same as the entries in \(\vec{v}=[2,3,0]\) and \(\vec{u}=[-1,0,5]\) in Subsection 1.2.2, and so we can rewrite the system we solved above as a vector equation

\[
a\,[2,3,0]+b\,[-1,0,5]=[19,21,-25]\text{,}
\]

which we can solve using techniques from the previous chapter. That is, we can row reduce the corresponding augmented matrix:

\[
\left[\begin{array}{cc|c} 2 & -1 & 19\\ 3 & 0 & 21\\ 0 & 5 & -25 \end{array}\right]
\longrightarrow
\left[\begin{array}{cc|c} 1 & 0 & 7\\ 0 & 1 & -5\\ 0 & 0 & 0 \end{array}\right]\text{,}
\]

so again \(a=7\) and \(b=-5\text{.}\)
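The same solve can be sketched in plain Python (also runnable in a Sage cell): since the middle entries pin down \(a\) and the last entries pin down \(b\text{,}\) we can read both off and then check the first entry for consistency.

```python
# Solve a*[2,3,0] + b*[-1,0,5] = [19,21,-25] entrywise:
# the middle entry gives 3a = 21, the last gives 5b = -25,
# and the first entry 2a - b must then agree.
target = (19, 21, -25)
a = target[1] / 3
b = target[2] / 5
assert (a, b) == (7, -5)
assert 2*a - b == target[0]  # remaining equation is consistent
print("a =", a, "b =", b)
```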

What this hopefully highlights is that if any two sets satisfy Definition 2.1.3 and are, in some sense, the same dimension, then whatever we say about one we can say about the other. And any visualization of one is, in a way, also a visualization of the other.

Every point on that plane can be written as a combination of \(\vec{v}=[2,3,0]\) and \(\vec{u}=[-1,0,5]\text{,}\) and so could equivalently be the coefficients of a polynomial of the form \(f(x)=a\, v(x)+b\, u(x)\text{.}\) In particular we see that the point \((19,21,-25)\) is on the plane. But not all points are on the plane; for example, the point \((15,9,5)\) is not, which means there do not exist \(a\) and \(b\) such that \(a\, v(x)+b\, u(x)=15x^2+9x+5\) or \(a\vec{v}+b\vec{u}=[15,9,5]\text{.}\) In fact, if we try to solve for \(a\) and \(b\) in the vector equation it doesn't work: comparing entries gives

\[
2a-b=15\text{,}\qquad 3a=9\text{,}\qquad 5b=5\text{,}
\]

so \(a=3\) and \(b=1\text{,}\) but then \(2a-b=5\ne 15\text{,}\) which is nonsense. We say then that all the linear combinations of \(\vec{v}\) and \(\vec{u}\) form a subspace; that is, they form a vector space of their own, but they do not give you every possible point.
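This membership test can be sketched as a small plain-Python function (also runnable in a Sage cell; the name `in_span` is my own). It uses the same entrywise elimination as above, so it applies only to these particular \(\vec{v}\) and \(\vec{u}\text{;}\) note that \(7\vec{v}-5\vec{u}=[19,21,-25]\) is on the plane:

```python
# Decide whether (p, q, r) lies on the plane spanned by v = [2,3,0]
# and u = [-1,0,5]: read a from 3a = q and b from 5b = r, then the
# remaining equation 2a - b = p must hold. (Divisions here are exact
# for these small inputs, so == comparison is safe.)
def in_span(p, q, r):
    a = q / 3
    b = r / 5
    return 2*a - b == p

print(in_span(19, 21, -25))  # True: on the plane
print(in_span(15, 9, 5))     # False: not on the plane
```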

###### Section Vocabulary.

Vector Space, Subspace