Last modified on 22 April 2015, at 09:34

# Real Analysis/Power Series

Please contribute to this section; it has been neglected. I have made a start, but it is not finished.

## Definition

We have already encountered series. A power series is a series of the form

$f(x)=\sum_{n=0}^\infty a_n x^n$ where $a_n \in \mathbb{R} \ \forall n\in \mathbb{N}$.

$f$ can also be viewed as a real-valued function. One of the first questions we will attempt to answer in this section is: for which values of $x$ does the series defining $f$ converge?
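For instance, taking every coefficient $a_n = 1$ gives the geometric series, which we have already seen converges exactly when $\vert x \vert < 1$:

```latex
f(x) = \sum_{n=0}^\infty x^n = \frac{1}{1-x}, \qquad \vert x \vert < 1.
```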

## Convergence

We can use the root test here. We can see that

$\limsup_{n\to\infty} {\left\vert a_n x^n \right\vert}^{1/n} = \vert x \vert \limsup_{n\to\infty} {\left\vert a_n \right\vert}^{1/n}=\vert x \vert R$

from before, so we see that there is a radius of convergence $r = 1/R$ (with the conventions $r = \infty$ if $R = 0$ and $r = 0$ if $R = \infty$) such that $f$ converges for $\vert x \vert < r$ and diverges for $\vert x \vert > r$. This radius has a special significance in complex analysis, but we will not be concerned with that here.
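As a sketch of how the formula $r = 1/R$ is applied, take $a_n = 1/n!$. Since $n! \geq (n/2)^{n/2}$, the $n$-th roots tend to $0$, so $R = 0$ and the series converges for every real $x$:

```latex
\limsup_{n\to\infty} \left\vert \frac{1}{n!} \right\vert^{1/n} = 0
\quad\Longrightarrow\quad
\sum_{n=0}^\infty \frac{x^n}{n!} \text{ converges for all } x \in \mathbb{R}.
```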

## Differentiability

We can see by the root test that if $\sum_{n=0}^\infty a_n x^n$ is convergent, then $\sum_{n=0}^\infty (n+1)a_{n+1} x^n$ is also convergent: since $\lim_{n\to\infty} (n+1)^{1/n}=1$, we have $\limsup_{n\to\infty} \vert (n+1)a_{n+1} \vert^{1/n}= \limsup_{n\to\infty} \vert a_{n} \vert^{1/n}$, so this power series has the same radius of convergence as the original. Intuitively, we would guess that it is the derivative of the original, but that requires proof. We look at the Newton quotient:

$f^{\prime} (x) = \lim_{h \to 0} \frac{\sum_{n=0}^\infty a_n (x+h)^n - \sum_{n=0}^\infty a_n x^n}{h}$

$f^{\prime} (x) = \lim_{h \to 0} \frac{\sum_{n=0}^\infty a_n \left( (x+h)^n - x^n \right)}{h}$

$f^{\prime} (x) = \lim_{h \to 0} \frac{\sum_{n=0}^\infty a_n \left( x^n + nhx^{n-1} + O(h^2) - x^n \right)}{h}$

$f^{\prime} (x) = \sum_{n=1}^\infty n a_n x^{n-1} = \sum_{n=0}^\infty (n+1)a_{n+1} x^n$

(Strictly, taking the limit inside the infinite sum needs justification; it can be made rigorous using the uniform convergence of the series on closed subintervals of $(-r, r)$.)
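The term-by-term rule above can be checked numerically on a truncated series. This is only a sketch: the coefficients $a_n = 1/n!$, the truncation at 30 terms, the point $x = 0.5$, and the step $h$ are illustrative choices, not part of the text above.

```python
import math

def power_series(coeffs, x):
    """Evaluate sum_n a_n x^n for a finite list of coefficients."""
    return sum(a * x**n for n, a in enumerate(coeffs))

def term_by_term_derivative(coeffs):
    """Coefficients of the formally differentiated series: (n+1) * a_{n+1}."""
    return [(n + 1) * a for n, a in enumerate(coeffs[1:])]

# Truncated exponential series: a_n = 1/n! (chosen purely for illustration)
coeffs = [1 / math.factorial(n) for n in range(30)]
x, h = 0.5, 1e-6

newton_quotient = (power_series(coeffs, x + h) - power_series(coeffs, x)) / h
formal_derivative = power_series(term_by_term_derivative(coeffs), x)

# The two agree up to an error of order h, as the derivation predicts.
print(abs(newton_quotient - formal_derivative))
```

The Newton quotient and the term-by-term derivative agree to roughly the size of $h$, matching the $O(h^2)/h$ error term in the derivation.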

## Taylor Series

One use of power series is to approximate functions. We can see that $f(0)=a_0$, so if a power series $f$ is to be a good approximation for a function $g$, then we need $a_0 = g(0)$.

We can also see, from the result on differentiability above, that we need $f^\prime (0)=a_1$, that $f^{\prime\prime} (0)=2a_2$, and by induction that $f^{(n)} (0)=n!\,a_n$, i.e. $a_n = f^{(n)}(0)/n!$, so we define the Taylor series of $f$ as

$\sum_{n=0}^\infty \frac{f^{(n)}(0)}{n!} x^n$
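For example, since every derivative of $e^x$ equals $e^x$, all derivatives at $0$ equal $1$, and the Taylor series is exactly the series whose radius of convergence we computed above:

```latex
f(x) = e^x, \quad f^{(n)}(0) = 1 \ \forall n \in \mathbb{N}
\quad\Longrightarrow\quad
\sum_{n=0}^\infty \frac{x^n}{n!}.
```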

By translation, we can also approximate $f$ by

$\sum_{n=0}^\infty \frac{f^{(n)}(t)}{n!} (x-t)^n$

To do this, we of course require that $f$ be differentiable infinitely many times at $0$ or at $t$, respectively.
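As a numerical sketch of the translated series, the function $\sin$, the expansion point $t = \pi/4$, and the truncation at 10 terms below are illustrative choices only:

```python
import math

def taylor_sin(x, t, terms=10):
    """Truncated Taylor series of sin about the point t.

    The derivatives of sin cycle with period 4:
    sin, cos, -sin, -cos, sin, ...
    """
    cycle = [math.sin(t), math.cos(t), -math.sin(t), -math.cos(t)]
    return sum(cycle[n % 4] / math.factorial(n) * (x - t)**n
               for n in range(terms))

x, t = 1.0, math.pi / 4
approx = taylor_sin(x, t)
print(abs(approx - math.sin(x)))
```

Because $\vert x - t \vert$ is small, only a few terms already give a very accurate approximation of $\sin(1)$.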