# UMD Probability Qualifying Exams/Aug2009Probability

## Problem 1

Let $X_{1},...,X_{n}$ be i.i.d. random variables with moment generating function $M(t)=E[\exp(tX_{1})]$, which is finite for all $t$. Let ${\tilde {X}}_{n}=(X_{1}+\cdots +X_{n})/n$. (a) Prove that $P[X_{1}>a]\leq \exp[-h(a)]$, where $h(a)=\sup _{t\geq 0}[at-\psi (t)]$ and $\psi (t)=\log M(t)$. (b) Prove that $P[{\tilde {X}}_{n}\geq a]\leq \exp[-nh(a)]$. (c) Assume $E[X_{1}]=0$. Use the result of (b) to establish that ${\tilde {X}}_{n}\to 0$ almost surely.

### Solution

(a) For any $t\geq 0$,

$$
{\begin{aligned}P[X_{1}>a]&=\int _{X_{1}>a}1\,dF=\int _{X_{1}>a}{\frac {\exp(ta)}{\exp(ta)}}\,dF\\&=e^{-at}\int _{X_{1}>a}e^{at}\,dF\leq e^{-at}\int _{X_{1}>a}e^{tX_{1}}\,dF\\&\leq e^{-at}\int _{\Omega }e^{tX_{1}}\,dF=e^{-at}E[\exp(tX_{1})]=\exp[-(at-\psi (t))].\end{aligned}}
$$

The middle inequality, $e^{at}\leq e^{tX_{1}}$ on the event $\{X_{1}>a\}$, is exactly where $t\geq 0$ is needed; this matches the range of the supremum defining $h(a)$. Since the bound $P[X_{1}>a]\leq \exp[-(at-\psi (t))]$ holds for every $t\geq 0$, we may take the infimum of the right-hand side over $t\geq 0$, which gives $P[X_{1}>a]\leq \exp[-\sup _{t\geq 0}(at-\psi (t))]=\exp[-h(a)]$.
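As a quick numerical sanity check (an illustration, not part of the exam solution), take $X_{1}\sim N(0,1)$, for which $\psi (t)=t^{2}/2$ and $h(a)=\sup _{t\geq 0}(at-t^{2}/2)=a^{2}/2$ for $a\geq 0$; the empirical tail frequency should sit below the Chernoff bound $e^{-a^{2}/2}$:

```python
import math
import random

random.seed(0)

# Assumed example distribution: X ~ N(0, 1), so psi(t) = t^2/2 and
# h(a) = sup_{t >= 0} (a*t - t^2/2) = a^2/2 for a >= 0.
a = 1.0
chernoff_bound = math.exp(-a ** 2 / 2)   # e^{-h(a)}, about 0.6065

n_samples = 100_000
tail_freq = sum(random.gauss(0, 1) > a for _ in range(n_samples)) / n_samples

# True tail P[X > 1] is about 0.1587, comfortably below the bound.
print(round(tail_freq, 4), round(chernoff_bound, 4))
```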

(b) Similarly, for any $t\geq 0$,

$$
{\begin{aligned}P[{\tilde {X}}_{n}\geq a]&=\int _{{\tilde {X}}_{n}\geq a}1\,dF=\int _{{\tilde {X}}_{n}\geq a}{\frac {\exp(nta)}{\exp(nta)}}\,dF\\&=e^{-nat}\int _{\sum _{i=1}^{n}X_{i}\geq an}e^{nat}\,dF\leq e^{-nat}\int _{\sum _{i=1}^{n}X_{i}\geq an}e^{t\sum _{i=1}^{n}X_{i}}\,dF\\&\leq e^{-nat}\int _{\Omega }e^{t\sum _{i=1}^{n}X_{i}}\,dF=e^{-nat}\left(\int _{\Omega }e^{tX_{1}}\,dF\right)^{n},\end{aligned}}
$$

where the last equality follows from the fact that the $X_{i}$ are independent and identically distributed. Hence $P[{\tilde {X}}_{n}\geq a]\leq \exp[-n(at-\psi (t))]$ for every $t\geq 0$, and taking the supremum over $t\geq 0$ gives $P[{\tilde {X}}_{n}\geq a]\leq \exp[-nh(a)]$.

(c) Fix $a>0$. The function $g(t)=at-\psi (t)$ satisfies $g(0)=0$ and $g'(0)=a-\psi '(0)=a-E[X_{1}]=a>0$, so $g(t)>0$ for some $t>0$ and hence $h(a)>0$. By (b),

$$\sum _{n=1}^{\infty }P[{\tilde {X}}_{n}\geq a]\leq \sum _{n=1}^{\infty }e^{-nh(a)}<\infty ,$$

so by the Borel-Cantelli lemma $P[{\tilde {X}}_{n}\geq a{\text{ i.o.}}]=0$. Applying the same argument to $-X_{1},-X_{2},...$ (whose moment generating function is also finite everywhere and whose mean is also $0$) gives $P[{\tilde {X}}_{n}\leq -a{\text{ i.o.}}]=0$. Taking $a=1/m$ and intersecting over $m=1,2,...$ shows that ${\tilde {X}}_{n}\to 0$ almost surely.
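A one-path illustration of (c), with the assumed choice $X_{i}\sim \operatorname {Uniform} (-1,1)$ (bounded, so the moment generating function is finite everywhere and $E[X_{1}]=0$): the running mean shrinks toward $0$ along the path:

```python
import random

random.seed(1)

# One sample path of the running mean (X_1 + ... + X_k)/k for
# i.i.d. Uniform(-1, 1) variables (an illustrative choice).
total = 0.0
running_mean = []
for k in range(1, 100_001):
    total += random.uniform(-1, 1)
    if k in (100, 10_000, 100_000):
        running_mean.append(total / k)

print(running_mean)  # entries shrink toward 0 along the path
```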

## Problem 2

Let $(\Omega ,{\mathcal {F}},P)$ be a probability space; let $X$ be a random variable with finite second moment and let ${\mathcal {G}}_{1}\subset {\mathcal {G}}_{2}$ be sub-$\sigma$-fields. Prove that $E[(X-E(X|{\mathcal {G}}_{2}))^{2}]\leq E[(X-E(X|{\mathcal {G}}_{1}))^{2}].$

## Problem 3

 Let $N_{1}(t),N_{2}(t)$ be independent homogeneous Poisson processes with rates $\lambda _{1},\lambda _{2}$ , respectively. Let $Z$ be the time of the first jump for the process $N_{1}(t)+N_{2}(t)$ and let $J$ be the random index of the component process that made the first jump. Find the joint distribution of $(J,Z)$ . In particular, establish that $J,Z$ are independent and that $Z$ is exponentially distributed.

### Solution

#### Show $Z$ is exponentially distributed

Let $\tau$ be the first jump time of a Poisson process $N(t)$ with rate $\lambda$. Using independent increments,

$$
{\begin{aligned}p_{\tau }(x)=\lim _{\epsilon \to 0}{\frac {F_{\tau }(x)-F_{\tau }(x-\epsilon )}{\epsilon }}&=\lim _{\epsilon \to 0}{\frac {P(N(x)>0\,\cap \,N(x-\epsilon )=0)}{\epsilon }}\\&=\lim _{\epsilon \to 0}{\frac {1}{\epsilon }}\,P(N(x-\epsilon )=0)\cdot P(N(x)-N(x-\epsilon )>0)\\&=\lim _{\epsilon \to 0}{\frac {1}{\epsilon }}\,e^{-\lambda (x-\epsilon )}\left(1-e^{-\lambda \epsilon }\right)=\lambda e^{-\lambda x},\end{aligned}}
$$

so $\tau \sim \operatorname {Exp} (\lambda )$. In particular, once we show that $N_{1}+N_{2}$ is a Poisson process with rate $\lambda _{1}+\lambda _{2}$, it follows that $Z\sim \operatorname {Exp} (\lambda _{1}+\lambda _{2})$.

#### $N_{1}(t)+N_{2}(t)$ is a Poisson process with parameter $\lambda _{1}+\lambda _{2}$

Proof: There are three conditions to check:

(i) $N_{1}(0)+N_{2}(0)=0$  almost surely

(ii) For $t>s$, is $N_{1}(t)+N_{2}(t)-N_{1}(s)-N_{2}(s)$ independent of $N_{1}(s)+N_{2}(s)$? Yes: each of $N_{1},N_{2}$ has independent increments, and the two processes are independent of each other, so the pair of increments $(N_{1}(t)-N_{1}(s),\,N_{2}(t)-N_{2}(s))$ is independent of $(N_{1}(s),N_{2}(s))$.

(iii) For $t>s$, is $N_{1}(t)+N_{2}(t)-N_{1}(s)-N_{2}(s)$ Poisson distributed with parameter $(\lambda _{1}+\lambda _{2})(t-s)$? Yes: the increments $N_{1}(t)-N_{1}(s)$ and $N_{2}(t)-N_{2}(s)$ are independent Poisson random variables with means $\lambda _{1}(t-s)$ and $\lambda _{2}(t-s)$, and a sum of independent Poisson random variables is again Poisson, with the means added.
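A simulation sketch of the superposition property, with assumed illustrative rates $\lambda _{1}=2$, $\lambda _{2}=3$ and $t=1$: the count $N_{1}(1)+N_{2}(1)$ should have mean and variance both equal to $\lambda _{1}+\lambda _{2}=5$, as a Poisson variable does:

```python
import random

random.seed(42)

def poisson_count(rate, t):
    """Number of arrivals in [0, t] of a rate-`rate` Poisson process,
    generated by summing Exp(rate) interarrival times."""
    count, clock = 0, random.expovariate(rate)
    while clock <= t:
        count += 1
        clock += random.expovariate(rate)
    return count

lam1, lam2, t = 2.0, 3.0, 1.0   # illustrative rates, not from the exam
trials = 20_000
sums = [poisson_count(lam1, t) + poisson_count(lam2, t) for _ in range(trials)]

mean = sum(sums) / trials
var = sum((s - mean) ** 2 for s in sums) / trials

# For a Poisson((lam1 + lam2) * t) variable, mean == variance == 5.
print(round(mean, 2), round(var, 2))
```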

#### Joint distribution of (J,Z)

For $x>0$, the event $\{J=1,\,Z\in (x,x+dx)\}$ means that process 1 jumps in $(x,x+dx)$ while neither process has jumped on $(0,x]$. Hence the joint density is

$$f_{J,Z}(1,x)=\lambda _{1}e^{-\lambda _{1}x}\,e^{-\lambda _{2}x}=\lambda _{1}e^{-(\lambda _{1}+\lambda _{2})x},\qquad f_{J,Z}(2,x)=\lambda _{2}e^{-(\lambda _{1}+\lambda _{2})x}.$$

Each factors as

$$f_{J,Z}(j,x)={\frac {\lambda _{j}}{\lambda _{1}+\lambda _{2}}}\cdot (\lambda _{1}+\lambda _{2})e^{-(\lambda _{1}+\lambda _{2})x}=P(J=j)\,f_{Z}(x),$$

so $J$ and $Z$ are independent, $P(J=j)=\lambda _{j}/(\lambda _{1}+\lambda _{2})$, and $Z\sim \operatorname {Exp} (\lambda _{1}+\lambda _{2})$, consistent with the first section, since $Z$ is the first jump time of the rate-$(\lambda _{1}+\lambda _{2})$ process $N_{1}+N_{2}$.
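A simulation sketch with assumed illustrative rates $\lambda _{1}=2$, $\lambda _{2}=3$: the first jump of $N_{1}+N_{2}$ is $Z=\min(E_{1},E_{2})$ with independent $E_{j}\sim \operatorname {Exp} (\lambda _{j})$, and $J$ marks which is smaller. We expect $P(J=1)\approx 2/5$, $E[Z]\approx 1/5$, and, by independence, $E[Z\mid J=1]\approx E[Z]$:

```python
import random

random.seed(7)

# Z = min(E1, E2), J = index of the process that jumped first.
# Rates are illustrative, not from the exam statement.
lam1, lam2 = 2.0, 3.0
trials = 50_000

z_all, z_given_j1, j1_count = [], [], 0
for _ in range(trials):
    e1, e2 = random.expovariate(lam1), random.expovariate(lam2)
    z = min(e1, e2)
    z_all.append(z)
    if e1 < e2:
        j1_count += 1
        z_given_j1.append(z)

p_j1 = j1_count / trials                       # near lam1/(lam1+lam2) = 0.4
mean_z = sum(z_all) / trials                   # near 1/(lam1+lam2) = 0.2
mean_z_j1 = sum(z_given_j1) / len(z_given_j1)  # independence: also near 0.2
print(round(p_j1, 3), round(mean_z, 3), round(mean_z_j1, 3))
```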

## Problem 4

Let $(X_{n},{\mathcal {F}}_{n})$ be a martingale sequence and for each $n$ let $\epsilon _{n}$ be an ${\mathcal {F}}_{n-1}$-measurable random variable. Define

$$Y_{n}=\sum _{i=1}^{n}\epsilon _{i}(X_{i}-X_{i-1}),\qquad Y_{0}=0.$$

Assuming that $Y_{n}$ is integrable for each $n$, show that $Y_{n}$ is a martingale.
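The sequence $Y_{n}$ is the discrete stochastic integral (martingale transform) of $\epsilon$ against $X$. A small simulation sketch, with the assumed illustrative choices of a simple symmetric random walk for $X_{n}$ and the predictable strategy $\epsilon _{i}=\mathbf {1} \{X_{i-1}<0\}$: the martingale property forces $E[Y_{n}]=E[Y_{0}]=0$:

```python
import random

random.seed(3)

# Martingale transform Y_n = sum_i eps_i * (X_i - X_{i-1}) with X a simple
# symmetric random walk and eps_i predictable (a function of the past only).
# Illustrative "betting" strategy: bet 1 when the walk is below 0, else 0.
def transform(n_steps):
    x, y = 0, 0.0
    for _ in range(n_steps):
        eps = 1.0 if x < 0 else 0.0   # F_{i-1}-measurable: uses X_{i-1} only
        step = random.choice((-1, 1))
        y += eps * step
        x += step
    return y

trials = 20_000
mean_y = sum(transform(50) for _ in range(trials)) / trials
print(round(mean_y, 3))  # near 0, since E[Y_n] = E[Y_0] = 0
```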

## Problem 5

Let $X_{1},X_{2},...$ be an i.i.d. sequence with $E[X_{i}]=0$ and $V[X_{i}]=\sigma ^{2}<\infty$. Prove that for any $\gamma >1/2$, the series $\sum _{k=1}^{\infty }X_{k}/k^{\gamma }$ converges almost surely.

### Solution

Define $Z_{k}:=X_{k}/k^{\gamma }$. Then $E[Z_{k}]=0$ and $V[Z_{k}]={\frac {\sigma ^{2}}{k^{2\gamma }}}$, and since $2\gamma >1$, $\sum _{k=1}^{\infty }V[Z_{k}]=\sigma ^{2}\sum _{k=1}^{\infty }k^{-2\gamma }<\infty$. We check the three series of Kolmogorov's three-series theorem with truncation level $A=1$:

(i) $\sum _{k}P(|Z_{k}|>1)\leq \sum _{k}V[Z_{k}]<\infty$ by Chebyshev's inequality;

(ii) since $E[Z_{k}]=0$, $|E[Z_{k}\mathbf {1} _{\{|Z_{k}|\leq 1\}}]|=|E[Z_{k}\mathbf {1} _{\{|Z_{k}|>1\}}]|\leq E[|Z_{k}|\mathbf {1} _{\{|Z_{k}|>1\}}]\leq E[Z_{k}^{2}]=V[Z_{k}]$, so the series of truncated means converges absolutely;

(iii) $\sum _{k}V[Z_{k}\mathbf {1} _{\{|Z_{k}|\leq 1\}}]\leq \sum _{k}E[Z_{k}^{2}]<\infty$.

Hence $\sum _{k=1}^{\infty }Z_{k}$ converges almost surely.
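A one-path numerical illustration, under the assumptions $X_{k}\sim N(0,1)$ and $\gamma =0.75$ (illustrative choices): the partial sums settle down, since the tail $\sum _{k>1000}k^{-1.5}$ contributes standard deviation only about $0.25$:

```python
import random

random.seed(9)

# Partial sums of sum_k X_k / k^gamma along one sample path, with the
# illustrative choices X_k ~ N(0, 1) (sigma^2 = 1) and gamma = 0.75 > 1/2.
gamma = 0.75
partial, checkpoints = 0.0, {}
for k in range(1, 200_001):
    partial += random.gauss(0, 1) / k ** gamma
    if k in (1_000, 200_000):
        checkpoints[k] = partial

# The tail beyond k = 1000 has standard deviation
# sqrt(sum_{k>1000} k^{-1.5}) ~ 0.25, so the checkpoints are close.
drift = abs(checkpoints[200_000] - checkpoints[1_000])
print(round(drift, 3))
```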

## Problem 6

Consider the following process $\{X_{n}\}$ taking values in $\{0,1,...\}$. Assume $U_{n},n=1,2,...$ is an i.i.d. sequence of positive integer valued random variables and let $X_{0}$ be independent of the $U_{n}$. Then

$$X_{n}={\begin{cases}X_{n-1}-1&{\text{if }}X_{n-1}\neq 0,\\U_{k}-1&{\text{if }}X_{n-1}=0{\text{ for the }}k{\text{th time}}.\end{cases}}$$
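The recursion can be simulated directly. In the sketch below, `run_chain` and the Uniform$\{1,2,3\}$ law for $U_{n}$ are illustrative assumptions; the deterministic check uses $U_{k}\equiv 3$, for which the path starting at $X_{0}=0$ cycles $0,2,1,0,2,1,\dots$

```python
import random

random.seed(5)

def run_chain(x0, n_steps, draw_u):
    """X_n = X_{n-1} - 1 if X_{n-1} != 0; otherwise, on the k-th visit
    to 0, X_n = U_k - 1 where U_k is drawn by draw_u()."""
    path, x = [x0], x0
    for _ in range(n_steps):
        x = x - 1 if x != 0 else draw_u() - 1
        path.append(x)
    return path

# Deterministic check: U_k = 3 always, X_0 = 0 gives the cycle 0, 2, 1, ...
print(run_chain(0, 6, lambda: 3))   # [0, 2, 1, 0, 2, 1, 0]

# Random example: U uniform on {1, 2, 3} (an assumed distribution).
print(run_chain(0, 8, lambda: random.randint(1, 3)))
```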