Solutions To Mathematics Textbooks/Principles of Mathematical Analysis (3rd edition) (ISBN 0070856133)/Chapter 6

Chapter 6

7

a

Since $f\in\mathcal{R}$ on $[0,1]$, by the definition on page 121 we know that $f$ must be bounded, say by $M$. We need to show that given $\varepsilon>0$ we can find some $c>0$ such that $\left|\int_0^1 f\,dx-\int_c^1 f\,dx\right|<\varepsilon$. By Theorem 6.12(c) we have $\int_0^1 f\,dx=\int_0^c f\,dx+\int_c^1 f\,dx$, and by Theorem 6.12(d), $\left|\int_0^c f\,dx\right|\le Mc$.

Hence $\left|\int_0^1 f\,dx-\int_c^1 f\,dx\right|\le Mc$, but since we can choose any $c>0$ and $M$ is fixed, we can choose $c<\varepsilon/M$ (any $c$ works if $M=0$), which yields $\left|\int_0^1 f\,dx-\int_c^1 f\,dx\right|<\varepsilon$. So, given $\varepsilon>0$ we can always choose a $c$ such that $\left|\int_0^1 f\,dx-\int_c^1 f\,dx\right|<\varepsilon$; that is, $\lim_{c\to0^+}\int_c^1 f\,dx=\int_0^1 f\,dx$, so the two definitions of the integral agree, as desired.
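The whole estimate can be collected in one line (a sketch, with $M>0$ the bound on $|f|$ used above):
$$\left|\int_0^1 f\,dx-\int_c^1 f\,dx\right|=\left|\int_0^c f\,dx\right|\le\sup_{[0,c]}|f|\cdot(c-0)\le Mc<\varepsilon\qquad\text{whenever }0<c<\frac{\varepsilon}{M}.$$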

b

Consider the function $f$ on $(0,1]$ which is defined to be $(-1)^n(n+1)$ on the open interval $\left(\frac{1}{n+1},\frac{1}{n}\right)$ for each positive integer $n$, and zero at those $x$ where $x=\frac{1}{n}$ for some $n$. This function is well defined, since we know that these intervals, together with the points $\frac{1}{n}$, are pairwise disjoint and cover $(0,1]$.

More specifically, the function has value $(-1)^n(n+1)$ on the open interval from $\frac{1}{n+1}$ to $\frac{1}{n}$.

First we evaluate the integral of the function itself. Consider a partition of the interval $\left[\frac{1}{N},1\right]$ with partition points at each $\frac{1}{n}$, $1\le n\le N$, for some positive integer $N$.

Then the lower and upper sums corresponding to the intervals of the partition from $\frac{1}{N}$ to $1$ are essentially the same, since the function is constant valued on the interior of each of these intervals; the finitely many points $\frac{1}{n}$ where $f=0$ can be enclosed in subintervals of arbitrarily small total length, so they affect neither integrability nor the value of the integral. Hence $f\in\mathcal{R}$ on $\left[\frac{1}{N},1\right]$, and as $N\to\infty$ the value of the upper and lower sums both approach $\sum_{n=1}^{\infty}(-1)^n(n+1)\left(\frac{1}{n}-\frac{1}{n+1}\right)$.

Thus we can express the value of the improper integral as the sum of the series $\sum_{n=1}^{\infty}(-1)^n(n+1)\left(\frac{1}{n}-\frac{1}{n+1}\right)=\sum_{n=1}^{\infty}\frac{(-1)^n}{n}$, but we recognize this sum as just a constant multiple of the alternating harmonic series. Hence, the integral converges.
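As a sketch of the term-by-term computation behind this (with the specific constants $(-1)^n(n+1)$ chosen above):
$$\int_{1/(n+1)}^{1/n} f\,dx=(-1)^n(n+1)\left(\frac{1}{n}-\frac{1}{n+1}\right)=(-1)^n(n+1)\cdot\frac{1}{n(n+1)}=\frac{(-1)^n}{n},$$
so $\int_{1/N}^1 f\,dx=\sum_{n=1}^{N-1}\frac{(-1)^n}{n}$, and the limit as $N\to\infty$ exists by the alternating series test.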

Now we examine the integral of the absolute value of the function. We argue similarly to the above, again partitioning at the points $\frac{1}{n}$ as defined above. The difference is that now, as we let $N\to\infty$, the upper and lower sums both go to $\sum_{n=1}^{\infty}(n+1)\left(\frac{1}{n}-\frac{1}{n+1}\right)=\sum_{n=1}^{\infty}\frac{1}{n}$, and so the limit defining the integral does not exist, as this is the harmonic series, which does not converge.
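Concretely (a sketch): since $|f|\ge0$, the quantity $\int_c^1|f|\,dx$ only increases as $c$ decreases, and
$$\int_{1/N}^{1}|f|\,dx=\sum_{n=1}^{N-1}(n+1)\left(\frac{1}{n}-\frac{1}{n+1}\right)=\sum_{n=1}^{N-1}\frac{1}{n}\longrightarrow\infty\quad(N\to\infty),$$
so $\lim_{c\to0^+}\int_c^1|f|\,dx$ cannot be finite.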

In the above proof of divergence, the important point is that the lower sums diverge; that the upper sums diverge as well is an immediate consequence of this.

So, we have demonstrated a function whose improper integral converges but does not converge absolutely, as desired.

8

We begin by showing ($\Leftarrow$) that $\int_1^\infty f(x)\,dx$ converges if $\sum_{n=1}^\infty f(n)$ converges.

So, we assume to start that $\sum_{n=1}^\infty f(n)$ converges. Now consider, for each $n\ge1$, the partition $\{n,n+1\}$ of the interval $[n,n+1]$. Since $f$ decreases monotonically, it must be that $\int_n^{n+1}f(x)\,dx\le f(n)\cdot 1$ and similarly that $\int_n^{n+1}f(x)\,dx\ge f(n+1)\cdot 1$. Thus, the integral over $[n,n+1]$ is bounded above by $f(n)$ and below by $f(n+1)$.

Now we observe that $\int_1^N f(x)\,dx$ may be written as a sum over the domain as $\int_1^N f(x)\,dx=\sum_{n=1}^{N-1}\int_n^{n+1}f(x)\,dx$. We know moreover that each of these integrals exists, by Theorem 6.9 (a monotonic function is integrable). Also, since $f$ is always nonnegative, each such integral must be nonnegative. Therefore, $\int_1^N f(x)\,dx$ is a partial sum of a series of nonnegative terms which is bounded above (by $\sum_{n=1}^\infty f(n)$, using the upper bound from the previous paragraph). Hence, by Theorem 3.24 this series converges; and since $b\mapsto\int_1^b f(x)\,dx$ is nondecreasing and bounded, the limit defining $\int_1^\infty f(x)\,dx$ exists.
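As a sketch of the resulting chain of bounds, for any integer $N\ge2$:
$$\sum_{n=2}^{N}f(n)=\sum_{n=1}^{N-1}f(n+1)\;\le\;\int_1^N f(x)\,dx=\sum_{n=1}^{N-1}\int_n^{n+1}f(x)\,dx\;\le\;\sum_{n=1}^{N-1}f(n)\;\le\;\sum_{n=1}^{\infty}f(n).$$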

Now we prove ($\Rightarrow$) that if $\int_1^\infty f(x)\,dx$ converges then $\sum_{n=1}^\infty f(n)$ converges.

So assume now that $\int_1^\infty f(x)\,dx$ converges. Then we can prove that the summation $\sum_{n=1}^\infty f(n)$ satisfies the Cauchy criterion. We established above that $\int_n^{n+1}f(x)\,dx$ is bounded above by $f(n)$ and below by $f(n+1)$. This implies that a sum $\sum_{n=M+1}^{N}f(n)$ is bounded above by the integral $\int_M^N f(x)\,dx$. Moreover, since the integral $\int_1^\infty f(x)\,dx$ exists and $f$ is nonnegative, we know that it has the property that, given $\varepsilon>0$, there is an $M_0$ such that $\int_M^N f(x)\,dx<\varepsilon$ whenever $N>M\ge M_0$; for otherwise the integral would not exist and would instead tend to infinity.

So now we can apply the Cauchy criterion for series. Since an upper bound of the tail sums has the property that, given $\varepsilon>0$, there is an $M_0$ such that it is smaller than $\varepsilon$ for all $N>M\ge M_0$, and since the terms of the series are nonnegative, the series itself has this property.
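To spell this out (a sketch, writing $L=\lim_{b\to\infty}\int_1^b f(x)\,dx$):
$$0\;\le\;\sum_{n=M+1}^{N}f(n)\;\le\;\int_M^N f(x)\,dx\;=\;\int_1^N f(x)\,dx-\int_1^M f(x)\,dx\;\longrightarrow\;L-L=0\quad\text{as }M,N\to\infty,$$
so the tails of the series can be made smaller than any given $\varepsilon>0$.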

Thus, the sum converges as desired.

10

a

We will prove that if $u\ge0$ and $v\ge0$ (with $p>1$, $q>1$, and $\frac1p+\frac1q=1$ as in the statement of the problem), then $uv\le\frac{u^p}{p}+\frac{v^q}{q}$, and that equality holds if and only if $u^p=v^q$. \begin{proof} We begin by proving the special case of equality.

Assume that $u^p=v^q$. Then $v=u^{p/q}$ (similarly we can show that $u=v^{q/p}$). Thus $uv=u\cdot u^{p/q}=u^{1+p/q}=u^{p\left(\frac1p+\frac1q\right)}=u^p$, and we see moreover that $\frac{u^p}{p}+\frac{v^q}{q}=\frac{u^p}{p}+\frac{u^p}{q}=u^p$, since in this case we have $v^q=u^p$. So the two sides are equal whenever $u^p=v^q$. That equality can occur only when $u^p=v^q$ will follow from the argument below, which shows the inequality is strict whenever $u^p\neq v^q$.


Now we show that as we vary $u$ (keeping $v$ fixed) we must always have $uv\le\frac{u^p}{p}+\frac{v^q}{q}$. For this, compute the derivative of $uv$ with respect to $u$, and the derivative of $\frac{u^p}{p}+\frac{v^q}{q}$ with respect to $u$. We get $v$ and $u^{p-1}$, respectively. If we have $u^p=v^q$ then these are equal, as demonstrated above (we showed that $v=u^{p/q}=u^{p-1}$ in that case, using $\frac pq=p-1$). In the case that $u$ is larger than this value, i.e. $u>v^{1/(p-1)}$, then $u^{p-1}>v$, and in the case that $u$ is less than this value then $u^{p-1}<v$. Consequently the difference $\frac{u^p}{p}+\frac{v^q}{q}-uv$ is decreasing in $u$ for $u<v^{1/(p-1)}$ and increasing for $u>v^{1/(p-1)}$, so it attains its minimum, which is $0$, exactly at $u=v^{1/(p-1)}$, that is, exactly when $u^p=v^q$.

This argument can be repeated in an analogous manner for variations in $v$; and given any $u$ and $v$, fixing $v$ there is exactly one value of $u$, namely $u=v^{q-1}$, for which $u^p=v^q$ and the two sides agree.

Thus, we observe that $uv\le\frac{u^p}{p}+\frac{v^q}{q}$, with equality if and only if $u^p=v^q$, as desired.\end{proof}
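The same calculus argument can be packaged compactly (a sketch; the function $\varphi$ is notation introduced here). Fix $v\ge0$ and set
$$\varphi(u)=\frac{u^p}{p}+\frac{v^q}{q}-uv,\qquad \varphi'(u)=u^{p-1}-v.$$
Then $\varphi'(u)<0$ for $u<v^{1/(p-1)}$ and $\varphi'(u)>0$ for $u>v^{1/(p-1)}$, so $\varphi$ attains its minimum at $u_*=v^{1/(p-1)}=v^{q-1}$, where
$$\varphi(u_*)=\frac{v^{q}}{p}+\frac{v^{q}}{q}-v^{q-1}\cdot v=v^q-v^q=0.$$
Hence $\varphi(u)\ge0$ for all $u\ge0$, with equality exactly when $u=v^{q-1}$, i.e. when $u^p=v^{p(q-1)}=v^q$.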

b

If $f\in\mathcal{R}(\alpha)$, $g\in\mathcal{R}(\alpha)$, $f\ge0$, $g\ge0$, and $\int_a^b f^p\,d\alpha=1=\int_a^b g^q\,d\alpha$, then $\int_a^b fg\,d\alpha\le1$. \begin{proof}

If $f\in\mathcal{R}(\alpha)$ and $g\in\mathcal{R}(\alpha)$ then $f^p$ and $g^q$ are in $\mathcal{R}(\alpha)$ by Theorem 6.11 (and $fg\in\mathcal{R}(\alpha)$ by Theorem 6.13(a)). Also, we have $fg\le\frac{f^p}{p}+\frac{g^q}{q}$ pointwise by Part (a), so we get $\int_a^b fg\,d\alpha\le\frac1p+\frac1q=1$ as desired.\end{proof}
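In display form (a sketch of the final estimate):
$$\int_a^b fg\,d\alpha\;\le\;\int_a^b\left(\frac{f^p}{p}+\frac{g^q}{q}\right)d\alpha\;=\;\frac1p\int_a^b f^p\,d\alpha+\frac1q\int_a^b g^q\,d\alpha\;=\;\frac1p+\frac1q\;=\;1.$$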


c

We prove Hölder's inequality: if $f$ and $g$ are complex functions in $\mathcal{R}(\alpha)$, then $\left|\int_a^b fg\,d\alpha\right|\le\left(\int_a^b|f|^p\,d\alpha\right)^{1/p}\left(\int_a^b|g|^q\,d\alpha\right)^{1/q}$. \begin{proof} If $f$ and $g$ are complex valued then we get $\left|\int_a^b fg\,d\alpha\right|\le\int_a^b|fg|\,d\alpha=\int_a^b|f|\,|g|\,d\alpha$ by Theorem 6.13(b), so it suffices to prove the inequality for the nonnegative functions $|f|$ and $|g|$.

If $A=\left(\int_a^b|f|^p\,d\alpha\right)^{1/p}\neq0$ and $B=\left(\int_a^b|g|^q\,d\alpha\right)^{1/q}\neq0$, then applying the previous part to the functions $F=|f|/A$ and $G=|g|/B$, where $\int_a^b F^p\,d\alpha=1$ and $\int_a^b G^q\,d\alpha=1$, gives what we wanted to show.

However, if one of the above is zero (say, without loss of generality, $A=\left(\int_a^b|f|^p\,d\alpha\right)^{1/p}=0$), then by Part (a) we just have $\int_a^b|f|\,|g|\,d\alpha\le\frac{t^p}{p}\int_a^b|f|^p\,d\alpha+\frac{1}{qt^q}\int_a^b|g|^q\,d\alpha=\frac{1}{qt^q}\int_a^b|g|^q\,d\alpha$ for every $t>0$. Taking the limit $t\to\infty$ we observe that $\int_a^b|f|\,|g|\,d\alpha=0$, so the inequality (whose right-hand side is also $0$) is still true.

\end{proof}
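To spell out the normalization step (a sketch; $A$, $B$, $F$, $G$ are as introduced above): Part (b) applied to $F=|f|/A$ and $G=|g|/B$ gives
$$\frac{1}{AB}\int_a^b|f|\,|g|\,d\alpha=\int_a^b FG\,d\alpha\le1,$$
and therefore
$$\left|\int_a^b fg\,d\alpha\right|\le\int_a^b|f|\,|g|\,d\alpha\le AB=\left(\int_a^b|f|^p\,d\alpha\right)^{1/p}\left(\int_a^b|g|^q\,d\alpha\right)^{1/q}.$$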

16


a

We take the expression $s\int_1^\infty\frac{[x]}{x^{s+1}}\,dx$ and express it as a sum of integrals over the intervals $[n,n+1]$ to get $s\sum_{n=1}^{\infty}\int_n^{n+1}\frac{[x]}{x^{s+1}}\,dx$; but since $[x]=n$ on each such interval (except at the single right endpoint, which does not affect the integral), we just write
$$s\int_1^\infty\frac{[x]}{x^{s+1}}\,dx=s\sum_{n=1}^{\infty}n\int_n^{n+1}\frac{dx}{x^{s+1}}.\qquad(1)$$

Now we exploit the Fundamental Theorem of Calculus, computing $\int_n^{n+1}\frac{dx}{x^{s+1}}=\frac1s\left(\frac{1}{n^s}-\frac{1}{(n+1)^s}\right)$. So, the summation in Equation (1) can more explicitly be written as $\sum_{n=1}^{\infty}n\left(\frac{1}{n^s}-\frac{1}{(n+1)^s}\right)$. However, grouping terms over common denominators, we observe that the sum partially telescopes to yield more simply $\sum_{n=1}^{\infty}\frac{1}{n^s}=\zeta(s)$.
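To make the telescoping explicit on partial sums (a sketch, using $s>1$ so that $N/(N+1)^s\to0$):
$$\sum_{n=1}^{N}n\left(\frac{1}{n^{s}}-\frac{1}{(n+1)^{s}}\right)=\sum_{n=1}^{N}\frac{n}{n^{s}}-\sum_{n=2}^{N+1}\frac{n-1}{n^{s}}=\sum_{n=1}^{N}\frac{1}{n^{s}}-\frac{N}{(N+1)^{s}}\longrightarrow\zeta(s)\quad(N\to\infty).$$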

b

Having now proved Part (a), it suffices to show that
$$s\int_1^\infty\frac{[x]}{x^{s+1}}\,dx=\frac{s}{s-1}-s\int_1^\infty\frac{x-[x]}{x^{s+1}}\,dx.$$


By the Fundamental Theorem of Calculus we have $\int_1^\infty\frac{x}{x^{s+1}}\,dx=\int_1^\infty x^{-s}\,dx=\frac{1}{s-1}$ for $s>1$. So
\begin{eqnarray*}
\int_1^\infty \frac{x}{x^{s+1}}\,dx&=&\frac{1}{s-1}\\
\Rightarrow s \int_1^\infty \frac{x}{x^{s+1}}\,dx&=&\frac{s}{s-1}\\
\Rightarrow s \int_1^\infty \left( \frac{x-[x]}{x^{s+1}} + \frac{[x]}{x^{s+1}} \right) dx&=&\frac{s}{s-1}\\
\Rightarrow s \int_1^\infty \frac{[x]}{x^{s+1}}\,dx &=&\frac{s}{s-1} - s \int_1^\infty \frac{x-[x]}{x^{s+1}}\,dx
\end{eqnarray*}
as desired.


It remains now to show that the integral $\int_1^\infty\frac{x-[x]}{x^{s+1}}\,dx$ appearing in Part (b) converges for every $s>0$.

Since $0\le x-[x]<1$ for all $x$, we have $0\le\frac{x-[x]}{x^{s+1}}\le\frac{1}{x^{s+1}}$, so by comparison we know that $\int_1^\infty\frac{x-[x]}{x^{s+1}}\,dx$ converges if $\int_1^\infty\frac{dx}{x^{s+1}}$ converges.

However, $\int_1^\infty\frac{dx}{x^{s+1}}$ converges by the integral test (Problem 8), since $x\mapsto\frac{1}{x^{s+1}}$ is nonnegative and decreasing on $[1,\infty)$ and the series $\sum_{n=1}^{\infty}\frac{1}{n^{s+1}}$ is convergent for $s>0$ (its exponent $s+1$ exceeds $1$). This completes the proof.
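Alternatively, a direct computation confirms the convergence for $s>0$ (a sketch):
$$0\;\le\;\int_1^\infty\frac{x-[x]}{x^{s+1}}\,dx\;\le\;\int_1^\infty\frac{dx}{x^{s+1}}\;=\;\lim_{b\to\infty}\frac{1-b^{-s}}{s}\;=\;\frac{1}{s}\;<\;\infty.$$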