Calculus/Series

Introduction

A series is the sum of a sequence of terms. An infinite series is the sum of an infinite number of terms (the actual sum of the series need not be infinite, as we will see below).

An arithmetic series is the sum of a sequence of terms with a common difference (the difference between consecutive terms). For example:

${\displaystyle 1+4+7+10+13+\cdots }$

is an arithmetic series with common difference 3, since ${\displaystyle a_{2}-a_{1}=3}$ , ${\displaystyle a_{3}-a_{2}=3}$ , and so forth.

A geometric series is the sum of terms with a common ratio. For example, an interesting series which appears in many practical problems in science, engineering, and mathematics is the geometric series ${\displaystyle r+r^{2}+r^{3}+r^{4}+\cdots }$  where the ${\displaystyle \cdots }$  indicates that the series continues indefinitely. A common way to study a particular series (following Cauchy) is to define a sequence consisting of the sum of the first ${\displaystyle n}$  terms. For example, to study the geometric series we can consider the sequence which adds together the first n terms:

${\displaystyle S_{n}(r)=\sum _{i=1}^{n}r^{i}}$

Generally by studying the sequence of partial sums we can understand the behavior of the entire infinite series.

Two of the most important questions about a series are:

• Does it converge?
• If so, what does it converge to?

For example, it is fairly easy to see that for ${\displaystyle r>1}$ , the geometric series ${\displaystyle S_{n}(r)}$  will not converge to a finite number (i.e., it will diverge to infinity). To see this, note that each time we increase the number of terms in the series, ${\displaystyle S_{n}(r)}$  increases by ${\displaystyle r^{n+1}}$ . Since ${\displaystyle r^{n+1}>1}$  for all ${\displaystyle r>1}$ , the partial sum grows by more than 1 with every term added, so it must grow without bound; that is, the series diverges.

Perhaps a more surprising and interesting fact is that for ${\displaystyle |r|<1}$ , ${\displaystyle S_{n}(r)}$  will converge to a finite value. Specifically, it is possible to show that

${\displaystyle \lim _{n\to \infty }S_{n}(r)={\frac {r}{1-r}}}$

Indeed, consider the quantity

${\displaystyle (1-r)S_{n}(r)=(1-r)\sum _{i=1}^{n}r^{i}=\sum _{i=1}^{n}r^{i}-\sum _{i=2}^{n+1}r^{i}=r-r^{n+1}}$

Since ${\displaystyle r^{n+1}\to 0}$  as ${\displaystyle n\to \infty }$  for ${\displaystyle |r|<1}$ , this shows that ${\displaystyle (1-r)S_{n}(r)\to r}$  as ${\displaystyle n\to \infty }$ . The quantity ${\displaystyle 1-r}$  is non-zero and doesn't depend on ${\displaystyle n}$  so we can divide by it and arrive at the formula we want.
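The convergence just derived is easy to check numerically. The sketch below (function name is ours) sums the first terms of the geometric series directly and compares against the closed form ${\displaystyle r/(1-r)}$:

```python
# Partial sums of the geometric series S_n(r) = r + r^2 + ... + r^n,
# compared against the closed-form limit r/(1-r), valid for |r| < 1.
def geometric_partial_sum(r, n):
    """Sum the first n terms r^1 + ... + r^n directly."""
    return sum(r**i for i in range(1, n + 1))

r = 0.5
limit = r / (1 - r)  # closed form derived above; equals 1 for r = 0.5
approx = geometric_partial_sum(r, 50)
print(abs(approx - limit))  # difference is r^(n+1), already tiny at n = 50
```

The difference between the partial sum and the limit is exactly ${\displaystyle r^{n+1}/(1-r)}$ times a constant, which is why the convergence is so fast for ${\displaystyle r={\tfrac {1}{2}}}$.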

We'd like to be able to draw similar conclusions about any series.

Unfortunately, there is no simple way to sum a series. The most we will be able to do in most cases is determine if it converges. The geometric and the telescoping series are the only types of series we can easily find the sum of.

Convergence

It is obvious that for a series to converge, the terms ${\displaystyle a_{n}}$  must tend to zero: if the terms stayed above some fixed positive number, the partial sums would grow without bound. However, even if the limit of the sequence is 0, this is not sufficient for the series to converge.

Consider the harmonic series, the sum of ${\displaystyle {\frac {1}{n}}}$ , and group terms

${\displaystyle {\begin{matrix}\sum \limits _{n=1}^{2^{m}}{\frac {1}{n}}&=&1+{\frac {1}{2}}&+&{\frac {1}{3}}+{\frac {1}{4}}&+&{\frac {1}{5}}+{\frac {1}{6}}+{\frac {1}{7}}+{\frac {1}{8}}&+\ \cdots &+&\sum \limits _{p=1+2^{m-1}}^{2^{m}}{\frac {1}{p}}\\\\&>&{\frac {3}{2}}&+&{\frac {2}{4}}&+&{\frac {4}{8}}&+\ \cdots &+&{\frac {2^{m-1}}{2^{m}}}\\\\&=&{\frac {3}{2}}&+&{\frac {1}{2}}&+&{\frac {1}{2}}&+\ \cdots &+&{\frac {1}{2}}\end{matrix}}}$

The final line contains m terms (${\displaystyle {\tfrac {3}{2}}}$  followed by ${\displaystyle m-1}$  copies of ${\displaystyle {\tfrac {1}{2}}}$ ), so it equals ${\displaystyle 1+{\tfrac {m}{2}}}$ . As m tends to infinity, so does this lower bound, hence the series diverges.

We can also deduce something about how quickly it diverges. Using the same grouping of terms, we can obtain both upper and lower bounds on the partial sums.

${\displaystyle 1+{\frac {m}{2}}\leq \sum _{n=1}^{2^{m}}{\frac {1}{n}}\leq 1+m}$

or

${\displaystyle 1+{\frac {\log _{2}(m)}{2}}\leq \sum _{n=1}^{m}{\frac {1}{n}}\leq 1+\log _{2}(m)}$

and the partial sums increase like ${\displaystyle \log _{2}(m)}$ , very slowly.
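A quick numerical check of these bounds (function name is ours) for ${\displaystyle m=2^{10}=1024}$  terms:

```python
import math

# Check that the harmonic partial sum H(m) = sum_{n=1}^m 1/n sits
# between the bounds 1 + log2(m)/2 and 1 + log2(m) derived above.
def harmonic(m):
    return sum(1.0 / n for n in range(1, m + 1))

m = 1024  # a power of two, where the grouping argument is sharpest
H = harmonic(m)
k = math.log2(m)  # k = 10
print(1 + k / 2 <= H <= 1 + k)  # the partial sum respects both bounds
```

At ${\displaystyle m=1024}$  the partial sum is only about 7.5, despite a thousand terms: logarithmic growth really is very slow.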

Comparison test

The argument above, based on considering upper and lower bounds on terms, can be modified to provide a general-purpose test for convergence and divergence called the comparison test (or direct comparison test). It can be applied to any series with nonnegative terms:

• If ${\displaystyle \sum b_{n}}$  converges and ${\displaystyle 0\leq a_{n}\leq b_{n}}$ , then ${\displaystyle \sum a_{n}}$  converges.
• If ${\displaystyle \sum b_{n}}$  diverges and ${\displaystyle 0\leq b_{n}\leq a_{n}}$ , then ${\displaystyle \sum a_{n}}$  diverges.
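As a numerical illustration of the first bullet (the series here are our choice): the terms ${\displaystyle a_{n}={\tfrac {1}{2^{n}+n}}}$  are dominated by ${\displaystyle b_{n}={\tfrac {1}{2^{n}}}}$ , whose series converges to 1, so ${\displaystyle \sum a_{n}}$  converges as well.

```python
# Comparison test in action: 0 <= a_n <= b_n termwise, and sum b_n = 1,
# so the partial sums of a_n are bounded above by 1.
a = lambda n: 1.0 / (2**n + n)
b = lambda n: 1.0 / 2**n

dominated = all(0 <= a(n) <= b(n) for n in range(1, 100))
partial = sum(a(n) for n in range(1, 100))
print(dominated, partial < 1.0)
```

The comparison test itself is what turns this bounded, increasing sequence of partial sums into a proof of convergence; the code only verifies the termwise domination.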

There are many such tests for convergence and divergence, the most important of which we will describe below.

Absolute convergence

Theorem: If the series of absolute values, ${\displaystyle \sum _{n=1}^{\infty }|a_{n}|}$ , converges, then so does the series ${\displaystyle \sum _{n=1}^{\infty }a_{n}}$

We say such a series converges absolutely.

Proof:

Let ${\displaystyle \epsilon >0}$

According to the Cauchy criterion for series convergence, there exists ${\displaystyle N}$  such that for all ${\displaystyle m\geq n>N}$  :

${\displaystyle \sum _{k=n}^{m}|a_{k}|<\epsilon }$

By the triangle inequality:

${\displaystyle \left|\sum _{k=n}^{m}a_{k}\right|\leq \sum _{k=n}^{m}|a_{k}|}$

Combining the two, we get:

${\displaystyle \left|\sum _{k=n}^{m}a_{k}\right|\leq \sum _{k=n}^{m}|a_{k}|<\epsilon }$

which is exactly the Cauchy criterion for the convergence of ${\displaystyle \sum _{n=1}^{\infty }a_{n}}$ .

${\displaystyle Q.E.D.}$

The converse does not hold. The series ${\displaystyle 1-{\frac {1}{2}}+{\frac {1}{3}}-{\frac {1}{4}}+\cdots }$  converges, even though the series of its absolute values diverges.

A series like this that converges, but not absolutely, is said to converge conditionally.

If a series converges absolutely, we can add terms in any order we like. The limit will still be the same.

If a series converges conditionally, rearranging the terms can change the limit. In fact, we can make the series converge to any limit we like by choosing a suitable rearrangement.

E.g., in the series ${\displaystyle 1-{\frac {1}{2}}+{\frac {1}{3}}-{\frac {1}{4}}+\cdots }$ , we can add only positive terms until the partial sum exceeds 100, subtract 1/2, add only positive terms until the partial sum exceeds 100, subtract 1/4, and so on, getting a sequence with the same terms that converges to 100.

This makes absolutely convergent series easier to work with. Thus, all but one of the convergence tests in this chapter will be for series whose terms are all positive; such series either converge absolutely or diverge. Other series will be studied by considering the corresponding series of absolute values.

Ratio test

For a series with terms ${\displaystyle a_{n}}$ , if

${\displaystyle \lim _{n\to \infty }\left|{\frac {a_{n+1}}{a_{n}}}\right|=r}$

then

• the series converges (absolutely) if ${\displaystyle r<1}$
• the series diverges if ${\displaystyle r>1}$  (or if ${\displaystyle r}$  is infinity)
• the series could do either if ${\displaystyle r=1}$ , so the test is not conclusive in this case.

E.g., suppose

${\displaystyle a_{n}={\frac {n!n!}{(2n)!}}}$

then

${\displaystyle {\frac {a_{n+1}}{a_{n}}}={\frac {(n+1)^{2}}{(2n+1)(2n+2)}}={\frac {n+1}{4n+2}}\to {\frac {1}{4}}}$

so this series converges.
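The ratio computed above can be checked numerically (function name is ours):

```python
from math import factorial

# For a_n = n! n! / (2n)!, the ratio a_{n+1}/a_n = (n+1)/(4n+2)
# should approach 1/4; at n = 50 it is 51/202 ≈ 0.2525.
def a(n):
    return factorial(n) ** 2 / factorial(2 * n)

ratio = a(51) / a(50)
print(abs(ratio - 0.25) < 0.01)
```

Since the limiting ratio ${\displaystyle {\tfrac {1}{4}}}$  is strictly less than 1, the ratio test confirms (absolute) convergence.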

Integral test

If ${\displaystyle f(x)}$  is a monotonically decreasing, always positive function, then the series

${\displaystyle \sum _{n=1}^{\infty }f(n)}$

converges if and only if the integral

${\displaystyle \int _{1}^{\infty }f(x)\,dx}$

converges.

E.g., consider ${\displaystyle f(x)={\frac {1}{x^{p}}}}$ , for a fixed ${\displaystyle p}$ .

• If ${\displaystyle p=1}$  this is the harmonic series, which diverges.
• If ${\displaystyle p<1}$  each term is larger than the corresponding term of the harmonic series, so it diverges.
• If ${\displaystyle p>1}$  then
 ${\displaystyle \int _{1}^{\infty }x^{-p}\,dx=\lim _{s\to \infty }\int _{1}^{s}x^{-p}\,dx=\lim _{s\to \infty }\left[-{\frac {1}{(p-1)x^{p-1}}}\right]_{1}^{s}=\lim _{s\to \infty }\left({\frac {1}{p-1}}-{\frac {1}{(p-1)s^{p-1}}}\right)={\frac {1}{p-1}}}$

The integral converges, for ${\displaystyle p>1}$ , so the series converges.
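For instance, with ${\displaystyle p=2}$  the partial sums of ${\displaystyle \sum 1/n^{2}}$  should stay bounded by ${\displaystyle 1+\int _{1}^{\infty }x^{-2}\,dx=2}$ ; a quick numerical check:

```python
# Partial sum of the p-series with p = 2; the integral test bounds it
# above by 1 + 1/(p-1) = 2. (Its exact limit is the known value pi^2/6.)
partial = sum(1.0 / n**2 for n in range(1, 100001))
print(partial < 2.0)
print(abs(partial - 1.6449) < 0.001)  # close to pi^2/6 ≈ 1.6449
```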

We can prove this test works by writing the integral as

${\displaystyle \int _{1}^{\infty }f(x)\,dx=\sum _{n=1}^{\infty }\int _{n}^{n+1}f(x)\,dx}$

and comparing each of the integrals with rectangles, giving the inequalities

${\displaystyle f(n)\geq \int _{n}^{n+1}f(x)\,dx\geq f(n+1)}$

Summing these inequalities over ${\displaystyle n}$  sandwiches the integral between ${\displaystyle \sum _{n=1}^{\infty }f(n)}$  and ${\displaystyle \sum _{n=2}^{\infty }f(n)}$ , so the sum and the integral converge or diverge together.

Limit comparison test

Given an infinite series ${\displaystyle \sum a_{n}}$  with positive terms only, if one can find another infinite series ${\displaystyle \sum b_{n}}$  with positive terms for which

${\displaystyle \lim _{n\to \infty }{\frac {a_{n}}{b_{n}}}=L}$

for a positive and finite ${\displaystyle L}$  (i.e., the limit exists and is not zero), then the two series either both converge or both diverge. That is,

• ${\displaystyle \sum a_{n}}$  converges if ${\displaystyle \sum b_{n}}$  converges, and
• ${\displaystyle \sum a_{n}}$  diverges if ${\displaystyle \sum b_{n}}$  diverges.

Example:

${\displaystyle a_{n}=n^{-{\frac {n+1}{n}}}}$

For large ${\displaystyle n}$ , the terms of this series are similar to, but smaller than, those of the harmonic series. We compare the limits.

${\displaystyle \lim {\frac {a_{n}}{b_{n}}}=\lim {\frac {n^{-{\frac {n+1}{n}}}}{\frac {1}{n}}}=\lim {\frac {n}{n^{\frac {n+1}{n}}}}=\lim {\frac {1}{n^{\frac {1}{n}}}}=1>0}$

so this series diverges.
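The limit computed above is easy to verify numerically (variable names are ours):

```python
# With a_n = n^(-(n+1)/n) and b_n = 1/n, the ratio a_n/b_n = n^(-1/n)
# should tend to 1, confirming the limit comparison with the
# (divergent) harmonic series.
a = lambda n: n ** (-(n + 1) / n)
b = lambda n: 1.0 / n

ratio = a(10**6) / b(10**6)
print(abs(ratio - 1.0) < 0.001)
```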

Alternating series

Given an infinite series ${\displaystyle \sum a_{n}}$ , if the signs of the ${\displaystyle a_{n}}$  alternate, that is if

${\displaystyle a_{n}=(-1)^{n}|a_{n}|}$

for all n or

${\displaystyle a_{n}=(-1)^{n+1}|a_{n}|}$

for all ${\displaystyle n}$ , then we call it an alternating series.

The alternating series test states that such a series converges if

${\displaystyle \lim _{n\to \infty }a_{n}=0}$

and

${\displaystyle {\bigl |}a_{n+1}{\bigr |}<|a_{n}|}$

(that is, the magnitude of the terms is decreasing).

Note that this test cannot lead to the conclusion that the series diverges: if its hypotheses fail, the test is simply inconclusive, although other tests may, of course, give a conclusion.

Estimating the sum of an alternating series

The absolute error that results in using a partial sum of an alternating series to estimate the final sum of the infinite series is smaller than the magnitude of the first omitted term.

${\displaystyle \left|\sum _{n=1}^{\infty }a_{n}-\sum _{n=1}^{m}a_{n}\right|<{\bigl |}a_{m+1}{\bigr |}}$
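We can check this bound numerically on the alternating harmonic series, whose sum is the known value ${\displaystyle \ln 2}$ :

```python
import math

# Truncating the alternating harmonic series 1 - 1/2 + 1/3 - ... after
# m terms should leave an error smaller than the first omitted term.
m = 100
partial = sum((-1) ** (n + 1) / n for n in range(1, m + 1))
error = abs(math.log(2) - partial)
print(error < 1.0 / (m + 1))  # error bounded by |a_{m+1}| = 1/101
```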

Geometric series

The geometric series can take either of the following forms

${\displaystyle \sum _{n=0}^{\infty }ar^{n}}$  or ${\displaystyle \sum _{n=1}^{\infty }ar^{n-1}}$

As you have seen at the start, the sum of the geometric series is

${\displaystyle s=\lim _{n\to \infty }S_{n}=\lim _{n\to \infty }{\frac {a(1-r^{n})}{1-r}}={\frac {a}{1-r}}\quad {\mbox{ for }}|r|<1}$ .

Telescoping series

${\displaystyle \sum _{n=0}^{\infty }(b_{n}-b_{n+1})}$

Expanding (or "telescoping") this type of series is informative. If we expand this series, we get:

${\displaystyle \sum _{n=0}^{k}(b_{n}-b_{n+1})=(b_{0}-b_{1})+(b_{1}-b_{2})+\cdots +(b_{k}-b_{k+1})}$

Every term except the first and the last cancels with its neighbor, so

${\displaystyle \sum _{n=0}^{k}(b_{n}-b_{n+1})=b_{0}-b_{k+1}}$
${\displaystyle \sum _{n=0}^{\infty }(b_{n}-b_{n+1})=\lim _{k\to \infty }\sum _{n=0}^{k}(b_{n}-b_{n+1})=\lim _{k\to \infty }\left(b_{0}-b_{k+1}\right)=b_{0}-\lim _{k\to \infty }b_{k}}$
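A classic example of this collapse (the example is our choice) is ${\displaystyle \sum _{n=1}^{\infty }{\tfrac {1}{n(n+1)}}}$ : writing ${\displaystyle {\tfrac {1}{n(n+1)}}={\tfrac {1}{n}}-{\tfrac {1}{n+1}}}$ , i.e. ${\displaystyle b_{n}={\tfrac {1}{n}}}$  with the sum starting at ${\displaystyle n=1}$ , the partial sums telescope to ${\displaystyle 1-{\tfrac {1}{k+1}}}$ , so the series sums to 1.

```python
from fractions import Fraction

# Telescoping sum: 1/(n(n+1)) = 1/n - 1/(n+1), so the partial sum
# collapses exactly to 1 - 1/(k+1). Fractions keep the check exact.
def telescoping_partial(k):
    return sum(Fraction(1, n * (n + 1)) for n in range(1, k + 1))

print(telescoping_partial(100) == 1 - Fraction(1, 101))  # exact collapse
```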