# UMD Probability Qualifying Exams/Jan2010Probability

## Problem 1

 Let ${\displaystyle \{X_{nk}\},k=1,...,r_{n},n=1,2,...}$  be a triangular array of Bernoulli random variables with ${\displaystyle p_{nk}=P[X_{nk}=1]}$ . Suppose that ${\displaystyle \sum _{k=1}^{r_{n}}p_{nk}\to \lambda \,{\text{ and }}\,\max _{k\leq r_{n}}p_{nk}\to 0.}$  Find the limiting distribution of ${\displaystyle \sum _{k=1}^{r_{n}}X_{nk}}$ .

### Solution

We will show that the sum converges in distribution to a Poisson distribution with parameter ${\displaystyle \lambda }$. The characteristic function of the Poisson(${\displaystyle \lambda }$) distribution is ${\displaystyle e^{\lambda (e^{it}-1)}}$. We show that the characteristic function ${\displaystyle E[\exp(it\sum _{k=1}^{r_{n}}X_{nk})]}$ converges to ${\displaystyle e^{\lambda (e^{it}-1)}}$ for every ${\displaystyle t}$, which implies the result by Lévy's continuity theorem.

${\displaystyle \log E[\exp(it\sum _{k=1}^{r_{n}}X_{nk})]=\sum _{k=1}^{r_{n}}\log((1-p_{nk})+p_{nk}e^{it})=\sum _{k=1}^{r_{n}}\log(1-p_{nk}(1-e^{it}))=\sum _{k=1}^{r_{n}}(-p_{nk}(1-e^{it})+O(p_{nk}^{2}))}$ . The expansion of the logarithm is valid for all large ${\displaystyle n}$ because ${\displaystyle \max _{k\leq r_{n}}p_{nk}\to 0}$. Moreover ${\displaystyle \sum _{k=1}^{r_{n}}p_{nk}^{2}\leq (\max _{k\leq r_{n}}p_{nk})\sum _{k=1}^{r_{n}}p_{nk}\to 0\cdot \lambda =0}$, so by our assumptions the right-hand side converges to ${\displaystyle \lambda (e^{it}-1)}$.
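As a numerical illustration (not part of the exam solution), one can take ${\displaystyle r_{n}=n}$ and ${\displaystyle p_{nk}=\lambda /n}$, which satisfies both hypotheses, and compare the empirical distribution of the row sum with the Poisson(${\displaystyle \lambda }$) probabilities; the particular values of ${\displaystyle \lambda }$, ${\displaystyle n}$, and the trial count below are arbitrary choices for the sketch:

```python
import math
import random

random.seed(0)

lam, n, trials = 2.0, 500, 5000
p = lam / n  # p_nk = lam/n: row sums of the p_nk tend to lam, max p_nk -> 0

# Empirical distribution of S_n = X_n1 + ... + X_nn over many trials
counts = {}
for _ in range(trials):
    s = sum(1 for _ in range(n) if random.random() < p)
    counts[s] = counts.get(s, 0) + 1

for k in range(5):
    emp = counts.get(k, 0) / trials
    poi = math.exp(-lam) * lam ** k / math.factorial(k)
    print(f"P[S={k}]: empirical {emp:.4f}, Poisson {poi:.4f}")
```

The empirical frequencies track the Poisson(2) mass function to within sampling error.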

## Problem 2

 Let ${\displaystyle X_{1},X_{2},...}$  be a sequence of i.i.d. random variables with uniform distribution on ${\displaystyle [0,1]}$ . Prove that ${\displaystyle \lim _{n\to \infty }(X_{1}X_{2}\cdots X_{n})^{1/n}}$  exists with probability one and compute its value.

### Solution

Let ${\displaystyle Y_{n}=(X_{1}X_{2}\cdots X_{n})^{1/n}}$ .

${\displaystyle \log(Y_{n})={\frac {1}{n}}\sum _{j=1}^{n}\log(X_{j})}$ .

The random variables ${\displaystyle \log(X_{j})}$  are i.i.d. with finite mean,

${\displaystyle E[\log(X_{j})]=\int _{0}^{1}\log(t)dt=-1}$ .

Therefore, the strong law of large numbers implies ${\displaystyle {\frac {1}{n}}\sum _{j=1}^{n}\log(X_{j})}$  converges with probability one to ${\displaystyle -1}$ .

So almost surely, ${\displaystyle \log(Y_{n})}$  converges to ${\displaystyle -1}$ , and by continuity of the exponential function, ${\displaystyle Y_{n}}$  converges to ${\displaystyle e^{-1}}$ .
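A quick simulation (with an arbitrary seed and sample size, purely illustrative) agrees with this limit:

```python
import math
import random

random.seed(1)

# Y_n = (X_1 ... X_n)^(1/n), computed stably via log Y_n = (1/n) sum log X_j
n = 200_000
log_sum = sum(math.log(random.random()) for _ in range(n))
y_n = math.exp(log_sum / n)
print(y_n, math.exp(-1))  # the two values should be close
```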

## Problem 3

 Let ${\displaystyle \{X_{n}|n=0,1,2,...\}}$  be a square integrable martingale with respect to a nested sequence of ${\displaystyle \sigma }$ -fields ${\displaystyle \{{\mathcal {F}}_{n}\}}$ . Assume ${\displaystyle E[X_{n}]=0}$ . Prove that ${\displaystyle P[\max _{1\leq k\leq n}|X_{k}|>\epsilon ]\leq E[X_{n}^{2}]/\epsilon ^{2}}$ .

### Solution

Since ${\displaystyle X_{n}}$  is a martingale and ${\displaystyle x\mapsto x^{2}}$  is convex, ${\displaystyle X_{n}^{2}}$  is a non-negative submartingale, and ${\displaystyle E[X_{n}^{2}]<\infty }$  since ${\displaystyle X_{n}}$  is square integrable. Doob's maximal inequality for non-negative submartingales gives ${\displaystyle P[\max _{1\leq k\leq n}X_{k}^{2}>\epsilon ^{2}]\leq E[X_{n}^{2}]/\epsilon ^{2}}$ . Since ${\displaystyle \{\max _{1\leq k\leq n}|X_{k}|>\epsilon \}=\{\max _{1\leq k\leq n}X_{k}^{2}>\epsilon ^{2}\}}$ , the result follows.
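The inequality can be sanity-checked numerically on a simple martingale, the symmetric ${\displaystyle \pm 1}$ random walk, for which ${\displaystyle E[X_{n}^{2}]=n}$; the walk, horizon, and threshold below are illustrative choices, not part of the proof:

```python
import random

random.seed(2)

n, trials, eps = 50, 20_000, 10.0
exceed = 0
for _ in range(trials):
    s, running_max = 0, 0
    for _ in range(n):
        s += random.choice((-1, 1))          # martingale increment
        running_max = max(running_max, abs(s))
    exceed += running_max > eps
lhs = exceed / trials   # empirical P[max_{k<=n} |X_k| > eps]
rhs = n / eps ** 2      # E[X_n^2] / eps^2 for this walk
print(lhs, "<=", rhs)
```

The empirical exceedance probability comes out well below the bound of 0.5, as the inequality guarantees.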

## Problem 4

 The random variable ${\displaystyle X}$  is defined on a probability space ${\displaystyle (\Omega ,{\mathcal {F}},P)}$ . Let ${\displaystyle {\mathcal {G}}_{1}\subset {\mathcal {G}}_{2}\subset {\mathcal {F}}}$  and assume ${\displaystyle X}$  has finite variance. Prove that ${\displaystyle E[(X-E[X|{\mathcal {G}}_{2}])^{2}]\leq E[(X-E[X|{\mathcal {G}}_{1}])^{2}].}$  In words, the dispersion of ${\displaystyle X}$  about its conditional mean becomes smaller as the ${\displaystyle \sigma }$ -field grows.

### Solution

${\displaystyle E[(X-E[X|{\mathcal {G}}_{1}])^{2}]=E[((X-E[X|{\mathcal {G}}_{2}])+(E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{1}]))^{2}]}$  ${\displaystyle =E[(X-E[X|{\mathcal {G}}_{2}])^{2}]+E[(E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{1}])^{2}]+2E[(X-E[X|{\mathcal {G}}_{2}])(E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{1}])]}$

We will show that the third term vanishes. Then since the second term is nonnegative, the result follows.

${\displaystyle E[(X-E[X|{\mathcal {G}}_{2}])(E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{1}])]=E[E[(X-E[X|{\mathcal {G}}_{2}])(E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{1}])|{\mathcal {G}}_{2}]]}$  by the tower property of conditional expectation.

${\displaystyle E[(X-E[X|{\mathcal {G}}_{2}])(E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{1}])|{\mathcal {G}}_{2}]=(E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{1}])E[(X-E[X|{\mathcal {G}}_{2}])|{\mathcal {G}}_{2}]}$ , since ${\displaystyle (E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{1}])}$  is ${\displaystyle {\mathcal {G}}_{2}}$ -measurable.

Finally, ${\displaystyle E[(X-E[X|{\mathcal {G}}_{2}])|{\mathcal {G}}_{2}]=E[X|{\mathcal {G}}_{2}]-E[E[X|{\mathcal {G}}_{2}]|{\mathcal {G}}_{2}]=E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{2}]=0}$ , so the cross term vanishes and the proof is complete.
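The inequality can be verified exactly on a small example: take three independent fair bits ${\displaystyle Y_{1},Y_{2},Z}$ with ${\displaystyle X=Y_{1}+Y_{2}+Z}$, ${\displaystyle {\mathcal {G}}_{1}=\sigma (Y_{1})}$, and ${\displaystyle {\mathcal {G}}_{2}=\sigma (Y_{1},Y_{2})}$ (an illustrative construction; the conditional expectations in the comments are computed by hand):

```python
from itertools import product

# X = y1 + y2 + z on the 8-point space of fair coin triples.
# E[X | G1] = y1 + 1      (averaging over y2 and z);
# E[X | G2] = y1 + y2 + 1/2  (averaging over z only).
mse1 = mse2 = 0.0
for y1, y2, z in product((0, 1), repeat=3):
    x = y1 + y2 + z
    mse1 += (x - (y1 + 1.0)) ** 2 / 8       # dispersion about E[X|G1]
    mse2 += (x - (y1 + y2 + 0.5)) ** 2 / 8  # dispersion about E[X|G2]
print(mse2, "<=", mse1)  # 0.25 <= 0.5
```

Here the gap between the two dispersions, 0.25, is exactly the middle term ${\displaystyle E[(E[X|{\mathcal {G}}_{2}]-E[X|{\mathcal {G}}_{1}])^{2}]=\operatorname {Var} (Y_{2})}$ from the decomposition above.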

## Problem 5

 Consider a sequence of random variables ${\displaystyle X_{1},X_{2},\ldots }$  such that ${\displaystyle X_{n}=1{\text{ or }}0}$ . Assume ${\displaystyle P[X_{1}=1]\geq \alpha }$  and ${\displaystyle P[X_{n}=1|X_{1},\ldots ,X_{n-1}]\geq \alpha >0{\text{ for n=2,3,}}\ldots }$  Prove that (a.) ${\displaystyle P[X_{n}=1{\text{ for some n}}]=1.}$  (b). ${\displaystyle P[X_{n}=1{\text{ infinitely often}}]=1.}$

### Solution

It suffices to prove (b), since (b) implies (a). We show ${\displaystyle P[X_{n}=1{\text{ finitely often}}]=0}$ . If ${\displaystyle X_{n}=1}$  for only finitely many ${\displaystyle n}$ , then there is an index ${\displaystyle T}$  such that ${\displaystyle X_{n}=0}$  for all ${\displaystyle n\geq T}$ . We show instead that ${\displaystyle P[X_{n}=0{\text{ for all }}n\geq T]=0}$  for every ${\displaystyle T}$ ; since a countable union of null events is null, the claim follows.

First notice, ${\displaystyle P[X_{1}=0]\leq (1-\alpha )}$  and ${\displaystyle P[X_{T}=0]=E[P[X_{T}=0|X_{1},X_{2},\ldots ,X_{T-1}]]\leq (1-\alpha ){\text{ for T}}>1}$ .

Now let ${\displaystyle A_{n}^{(T)}}$  denote the event ${\displaystyle [X_{T+n-1}=\ldots =X_{T}=0]}$ . The events ${\displaystyle A_{n}^{(T)}}$  decrease in ${\displaystyle n}$ , so by continuity from above, ${\displaystyle P[X_{n}=0{\text{ for all }}n\geq T]=\lim _{n\to \infty }P[A_{n}^{(T)}]}$ .

Notice that ${\displaystyle P[A_{n}^{(T)}]=P[X_{T+n-1}=0|A_{n-1}^{(T)}]P[A_{n-1}^{(T)}]\leq (1-\alpha )P[A_{n-1}^{(T)}]{\text{ for }}n=2,3,\ldots }$ , where the inequality holds because ${\displaystyle P[X_{T+n-1}=0|X_{1},\ldots ,X_{T+n-2}]\leq 1-\alpha }$  pointwise, and hence also after averaging over the event ${\displaystyle A_{n-1}^{(T)}}$ . Also ${\displaystyle P[A_{1}^{(T)}]=P[X_{T}=0]\leq (1-\alpha )}$ . Therefore ${\displaystyle P[A_{n}^{(T)}]\leq (1-\alpha )^{n}}$  and ${\displaystyle \lim _{n\rightarrow \infty }P[A_{n}^{(T)}]=0}$ . So ${\displaystyle P[X_{n}=0{\text{ for all }}n\geq T]=0}$  and we reach the desired conclusion.
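The geometric bound ${\displaystyle P[A_{n}^{(T)}]\leq (1-\alpha )^{n}}$ can be checked by simulation on a sequence whose conditional success probability varies with the past but always stays at least ${\displaystyle \alpha }$; the particular dependence on the previous value below is an arbitrary illustrative choice:

```python
import random

random.seed(3)

alpha, n, trials = 0.3, 8, 50_000
all_zero = 0
for _ in range(trials):
    prev, ok = 0, True
    for _ in range(n):
        # P[X_k = 1 | past] depends on the previous value but is always >= alpha
        p = alpha if prev == 1 else alpha + 0.2
        x = 1 if random.random() < p else 0
        ok = ok and x == 0
        prev = x
    all_zero += ok
emp = all_zero / trials   # empirical P[X_1 = ... = X_n = 0]
bound = (1 - alpha) ** n  # the geometric bound from the proof
print(emp, "<=", bound)
```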

## Problem 6

 Let ${\displaystyle \{N(t):t\geq 0\}}$  be a nonhomogeneous Poisson process. That is, ${\displaystyle N(0)=0}$  a.s., ${\displaystyle N(t)}$  has independent increments, and ${\displaystyle N(t)-N(s)}$  has a Poisson distribution with parameter ${\displaystyle \int _{s}^{t}\lambda (u)du}$  where ${\displaystyle 0\leq s\leq t}$  and the rate function ${\displaystyle \lambda (u)}$  is a continuous positive function. (a.) Find a continuous strictly increasing function ${\displaystyle h(t)}$  such that the time-transformed process ${\displaystyle {\tilde {N}}(t)=N(h(t))}$  is a homogeneous Poisson process with rate parameter 1. (b.) Let ${\displaystyle T}$  be the time until the first event in the nonhomogeneous process ${\displaystyle N(t)}$ . Compute ${\displaystyle P[T>t]}$  and ${\displaystyle P[T>t|N(s)=n]}$ , where ${\displaystyle s>t}$ .
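No solution is recorded for this problem. As a numerical sanity check of one ingredient of part (b), note that ${\displaystyle \{T>t\}=\{N(t)=0\}}$, so ${\displaystyle P[T>t]=\exp(-\int _{0}^{t}\lambda (u)du)}$ follows directly from the problem's definition of the process. The sketch below simulates the process by thinning for the illustrative rate ${\displaystyle \lambda (u)=1+u}$; all specific parameter choices are assumptions for the example:

```python
import math
import random

random.seed(4)


def rate(u):
    return 1.0 + u  # illustrative rate function lambda(u)


t, rate_max, trials = 1.0, 2.0, 40_000  # rate(u) <= 2 on [0, 1]

# Thinning: propose arrivals at constant rate rate_max, accept a
# proposal at time u with probability rate(u)/rate_max; then T > t
# iff nothing is accepted in [0, t].
survive = 0
for _ in range(trials):
    u, none_accepted = 0.0, True
    while True:
        u += random.expovariate(rate_max)
        if u > t:
            break
        if random.random() < rate(u) / rate_max:
            none_accepted = False
            break
    survive += none_accepted
emp = survive / trials
exact = math.exp(-(t + t * t / 2))  # exp(-int_0^t (1+u) du)
print(emp, exact)
```

The empirical survival probability matches ${\displaystyle e^{-3/2}}$ to within sampling error.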