# UMD Probability Qualifying Exams/Jan2015Probability

## Problem 1

 A die is rolled infinitely many times. (a) Find the expected number of combinations (6,6) among the first 1000 rolls. Find the expected number of combinations (1,2) among the first 1000 rolls. (b) What is larger and by how much: the expected number of rolls till the first occurrence of (6,6) or the expected number of rolls till the first occurrence of (1,2)? (c) What is the probability that (6,6) occurs before (1,2)?

### Solution

(a) Let $X_{n}$  be the value of the $n$th roll. The number of (6,6)'s in the first 1000 rolls is given by $\sum _{n=1}^{999}\mathbb {I} (X_{n}=6{\text{ and }}X_{n+1}=6)$

By linearity of expectation,

$E[\sum _{n=1}^{999}\mathbb {I} (X_{n}=6{\text{ and }}X_{n+1}=6)]=\sum _{n=1}^{999}E[\mathbb {I} (X_{n}=6{\text{ and }}X_{n+1}=6)]$

$=\sum _{n=1}^{999}\mathbb {P} (X_{n}=6{\text{ and }}X_{n+1}=6)=999/36$

Similarly, the expected number of (1,2)'s in the first 1000 rolls is 999/36.
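
As a quick sanity check (not part of the exam solution), a Monte Carlo estimate of both counts; the trial count and seed here are arbitrary choices:

```python
import random

def count_pattern(rolls, pattern):
    """Count (overlapping) occurrences of a two-roll pattern."""
    return sum(1 for i in range(len(rolls) - 1)
               if (rolls[i], rolls[i + 1]) == pattern)

random.seed(0)
trials = 2000
totals = {(6, 6): 0, (1, 2): 0}
for _ in range(trials):
    rolls = [random.randint(1, 6) for _ in range(1000)]
    for p in totals:
        totals[p] += count_pattern(rolls, p)

for p in totals:
    print(p, totals[p] / trials)  # both averages should be near 999/36 ≈ 27.75
```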

(b) For the general approach to such problems, see the Shuo-Yen Li paper cited below. The expected waiting time until the first (6,6) is 6+36=42, while the expected waiting time until the first (1,2) is 0+36=36.

The argument is as follows. Suppose we continue rolling until the desired pattern first appears. Before each roll a new gambler joins with 1 coin and bets, at fair odds, that the successive rolls will spell out the pattern: his first bet is that the next roll shows the first character of the pattern, his second bet (with his entire stake) that the following roll shows the second character, and so on. A lost bet forfeits all his coins; each winning bet multiplies his stake by 6. The combined fortune of the gamblers, minus the number of gamblers who have joined, is a martingale, so by the optional stopping theorem the expected total of all gamblers' coins when the game ends equals the expected number of gamblers who joined, i.e. the expected waiting time for the pattern.

For the pattern (6,6), when the game ends the gambler who joined on the last roll has won one bet and holds 6 coins, while the gambler who joined one roll earlier has won both bets and holds 36 coins. All other gamblers hold 0, giving 6+36=42.

For the pattern (1,2), the gambler who joined one roll before the end holds 36 coins; the gambler who joined on the last roll bet on a 1 and lost. All others hold 0, giving 36.

• Note the optional stopping theorem applies because the expected value of the stopping time is finite and the martingale increments are bounded.
• A more direct way to do this problem is with absorbing Markov chains.
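
The waiting times 42 and 36 can be checked numerically; a hedged simulation sketch (pattern list, trial count, and seed are arbitrary choices):

```python
import random

def wait_for(pattern, rng):
    """Number of rolls of a fair die until the two-roll `pattern` first appears."""
    prev = rng.randint(1, 6)
    n = 1
    while True:
        cur = rng.randint(1, 6)
        n += 1
        if (prev, cur) == pattern:
            return n
        prev = cur

rng = random.Random(1)
trials = 20000
means = {}
for pattern in [(6, 6), (1, 2)]:
    means[pattern] = sum(wait_for(pattern, rng) for _ in range(trials)) / trials

print(means)  # ≈ 42 for (6,6) and ≈ 36 for (1,2)
```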

(c) Since the two patterns do not overlap each other, the odds that (6,6) occurs first are inversely proportional to the expected waiting times: 36 : 42. That is, the probability that (6,6) occurs first is 36/(42+36)=6/13.
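
The probability 6/13 can also be checked by simulating the race between the two patterns (a hedged sketch; trial count and seed are arbitrary):

```python
import random

def race(rng):
    """Roll a fair die until (6,6) or (1,2) appears; return the winning pattern."""
    prev = rng.randint(1, 6)
    while True:
        cur = rng.randint(1, 6)
        if (prev, cur) in ((6, 6), (1, 2)):
            return (prev, cur)
        prev = cur

rng = random.Random(0)
trials = 20000
wins_66 = sum(race(rng) == (6, 6) for _ in range(trials))
print(wins_66 / trials)  # ≈ 6/13 ≈ 0.4615
```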

For patterns which do overlap, the analysis is slightly more complicated; see Corollary 3.2 in the Shuo-Yen Li paper.

Shuo-Yen Robert Li, A Martingale Approach to the Study of Occurrence of Sequence Patterns in Repeated Experiments, Annals of Probability 8 (1980).

## Problem 2

 We'll say that a random variable is discrete if the measure it induces is discrete. A random variable is absolutely continuous if the measure it induces is absolutely continuous. Prove or disprove. (a) A sum of two discrete random variables is always a discrete random variable (b) A sum of a discrete and an absolutely continuous random variable is always an absolutely continuous random variable. (c) A sum of two absolutely continuous random variables is always an absolutely continuous random variable. (d) A sum of two independent discrete random variables is always a discrete random variable. (e) A sum of a discrete and an absolutely continuous random variable that are independent is always an absolutely continuous random variable. (f) A sum of two independent absolutely continuous random variables is always an absolutely continuous random variable.

### Solution

(a) True. A random variable is discrete iff it takes values in some countable set with probability 1. If $A$ takes values a.s. in the countable set $S_{A}$ and $B$ a.s. in the countable set $S_{B}$, then $A+B$ takes values a.s. in the sum-set $S_{A}+S_{B}$, which is countable.

(b) True. A random variable $X$ is absolutely continuous iff $\mathbb {P} (X\in E)=0$ for every Lebesgue-null set $E$. Let $Y$ be discrete, taking values a.s. in the countable set $S$. For any Lebesgue-null $E$, up to a null event $\{X+Y\in E\}\subseteq \{X\in \bigcup _{y\in S}(E-y)\}$, and a countable union of Lebesgue-null sets is Lebesgue-null, so $\mathbb {P} (X+Y\in E)=0$.

(c) False. For any absolutely continuous random variable $X$, $-X$ is also absolutely continuous, and $X+(-X)=0$, which is a discrete random variable.

(d) True, by (a).

(e) True, by (b).

(f) True. If $X$ and $Y$ are independent and absolutely continuous with densities $f_{X}$ and $f_{Y}$, then $X+Y$ has density $f_{X}*f_{Y}$, so it is absolutely continuous.
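
Explicitly, if independent $X$ and $Y$ have densities $f_{X}$ and $f_{Y}$, then $X+Y$ has the density

```latex
f_{X+Y}(z) = (f_X * f_Y)(z) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(z-x)\, dx
```

which is nonnegative and integrates to 1 by Tonelli's theorem, so the induced measure is absolutely continuous.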

## Problem 3

 Suppose that $\xi _{1},\xi _{2},\ldots$ are independent random variables uniformly distributed on the intervals $[-a_{1},a_{1}],[-a_{2},a_{2}],\ldots ,$ respectively, where $a_{1},a_{2},\ldots$ are positive. Let $\eta _{n}=\prod _{i=1}^{n}\xi _{i}$ (a). Prove that $\eta _{n}$ tend to zero almost surely if $\lim \sup _{n\rightarrow \infty }a_{n}\leq 2$ . (b). Prove that $\eta _{n}$ do not have a limit if $\lim _{n\rightarrow \infty }(a_{n}/n^{2})=c>0$ .

### Solution

(a) Write $|\xi _{i}|=a_{i}X_{i}$  where $X_{i}$  is uniformly distributed on $[0,1]$ . Then

$\log |\eta _{n}|=\sum _{i=1}^{n}(\log a_{i}+\log X_{i})$

$E[\log X_{i}]=\int _{0}^{1}\log x\,dx=-1$

By the strong law of large numbers,

${\frac {1}{n}}\sum _{i=1}^{n}\log X_{i}$  converges to -1 almost surely.

$\lim \sup _{n\rightarrow \infty }a_{n}\leq 2\implies \lim \sup _{n\rightarrow \infty }{\frac {1}{n}}\sum _{i=1}^{n}\log a_{i}\leq \log 2<1$ .

Hence $\limsup _{n\rightarrow \infty }{\frac {1}{n}}\log |\eta _{n}|\leq \log 2-1<0$ a.s., so $\log |\eta _{n}|$  converges to $-\infty$  a.s., and $\eta _{n}$  converges to 0 a.s.
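
A numerical illustration of the borderline case $a_{i}\equiv 2$ (a sketch; the number of factors and the seed are arbitrary): the running average of $\log |\xi _{i}|$ should be near $\log 2-1\approx -0.307<0$, confirming $|\eta _{n}|\rightarrow 0$.

```python
import math
import random

random.seed(0)
n = 100_000  # number of factors; arbitrary
# Borderline case a_i = 2 for all i, so xi_i is uniform on [-2, 2].
log_eta = 0.0
for _ in range(n):
    xi = random.uniform(-2.0, 2.0)
    log_eta += math.log(abs(xi))  # log|eta_n| accumulates log|xi_i|

print(log_eta / n)  # should be near log 2 - 1 ≈ -0.307
```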

(b) In this case $\log a_{n}=2\log n+\log c+o(1)$ , so

$\lim _{n\rightarrow \infty }{\frac {1}{n}}\sum _{i=1}^{n}\log a_{i}=\infty$

So ${\frac {1}{n}}\log |\eta _{n}|$  converges to $\infty$  a.s., and $|\eta _{n}|$  converges to $\infty$  almost surely. However, each $\xi _{i}$  is negative with probability 1/2, independently, so by the second Borel-Cantelli lemma the sign of $\eta _{n}$  changes infinitely often a.s. Hence $\eta _{n}$  has no limit almost surely.

## Problem 4

 Let $(M_{t},{\mathcal {F}}_{t}),t\geq 0$ , be a positive continuous martingale that converges to zero almost surely as $t\rightarrow \infty$ . (a) Prove that for each $\lambda >0$ $\mathbb {P} (\sup _{t\geq 0}M_{t}\geq \lambda |{\mathcal {F}}_{0})=\min(1,M_{0}/\lambda )$ . (b) Let $W_{t}$ be a standard one-dimensional Brownian motion. Prove that for each $a>0$ the random variable $\sup _{t\geq 0}(W_{t}-{\frac {1}{2}}at)$ has exponential distribution with parameter $a$ .

### Solution

(a) If $M_{0}\geq \lambda$ , then $\mathbb {P} (\sup _{t\geq 0}M_{t}\geq \lambda |{\mathcal {F}}_{0})=1$  trivially, so assume $M_{0}<\lambda$ . For fixed $s>0$  let $\tau =\min(s,u)$ , where $u$  is the first time $M_{t}=\lambda$ . The stopping time $\tau$  is bounded, so by the optional stopping theorem,

$M_{0}=E[M_{\tau }|{\mathcal {F}}_{0}]=\lambda \,\mathbb {P} (\sup _{0\leq t\leq s}M_{t}\geq \lambda |{\mathcal {F}}_{0})+E[M_{s}\mathbb {I} (\sup _{0\leq t\leq s}M_{t}<\lambda )|{\mathcal {F}}_{0}]$

where we used that $M_{\tau }=\lambda$  on the event $\{\sup _{0\leq t\leq s}M_{t}\geq \lambda \}$  (by continuity of $M$ ). On the complementary event $0<M_{s}<\lambda$ , and $M_{s}\rightarrow 0$  a.s., so by bounded convergence the second term tends to 0 as $s\rightarrow \infty$ . Since $\{\sup _{0\leq t\leq s}M_{t}\geq \lambda \}\uparrow \{\sup _{t\geq 0}M_{t}\geq \lambda \}$ , letting $s\rightarrow \infty$  gives

$\mathbb {P} (\sup _{t\geq 0}M_{t}\geq \lambda |{\mathcal {F}}_{0})=M_{0}/\lambda$

when $M_{0}<\lambda$ . Combining the two cases yields $\min(1,M_{0}/\lambda )$ .

(b) Wald's exponential martingale $X_{t}=e^{aW_{t}-{\frac {1}{2}}a^{2}t}$  is a positive continuous martingale with $X_{0}=1$ ; moreover $X_{t}\rightarrow 0$  a.s., since $W_{t}/t\rightarrow 0$  a.s. implies $aW_{t}-{\frac {1}{2}}a^{2}t\rightarrow -\infty$ . So part (a) applies.

$\mathbb {P} (\sup _{t\geq 0}(W_{t}-{\frac {1}{2}}at)\geq s)=\mathbb {P} (\sup _{t\geq 0}(aW_{t}-{\frac {1}{2}}a^{2}t)\geq as)$

$=\mathbb {P} (\sup _{t\geq 0}e^{(aW_{t}-{\frac {1}{2}}a^{2}t)}\geq e^{as})=\mathbb {P} (\sup _{t\geq 0}X_{t}\geq e^{as})$

By (a),

$\mathbb {P} (\sup _{t\geq 0}X_{t}\geq e^{as})=\min(1,e^{-as})=e^{-as}$  for $s\geq 0$ , so $\sup _{t\geq 0}(W_{t}-{\frac {1}{2}}at)$  is exponentially distributed with parameter $a$ .
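
The exponential law can be illustrated by simulating a discretized drifted Brownian path (a rough sketch with $a=1$; step size, horizon, path count, and seed are arbitrary, and time discretization biases the supremum slightly downward):

```python
import math
import random

random.seed(0)
a = 1.0
dt = 0.02           # time step (smaller reduces discretization bias)
steps = 1000        # horizon T = 20, by which the drift -a*t/2 dominates
paths = 2000
sd = math.sqrt(dt)  # standard deviation of one Brownian increment
sups = []
for _ in range(paths):
    x = 0.0
    best = 0.0
    for _ in range(steps):
        x += -0.5 * a * dt + sd * random.gauss(0.0, 1.0)
        best = max(best, x)
    sups.append(best)

mean_sup = sum(sups) / paths
tail = sum(s >= 1.0 for s in sups) / paths
# For Exp(a=1): mean 1 and P(sup >= 1) = e^{-1} ≈ 0.37 (estimates run a bit
# low because the discrete grid misses the true continuous maximum).
print(mean_sup, tail)
```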

## Problem 5

 Suppose that $\xi$ is a bounded random variable on $(X,{\mathcal {B}}(X),P)$ where X is a metric space, ${\mathcal {B}}(X)$ is its Borel $\sigma$ -algebra, and P is a probability measure. Suppose that $\xi (x)\neq 0$ for all $x\in X$ . Prove that there is a bounded continuous function $\eta$ on X such that $E[\xi \eta ]>0$ .

### Solution

Suppose $|\xi (x)|<B$  for all $x$ . By Lusin's theorem, for every $\epsilon >0$ , there exists a continuous $f_{\epsilon }$  such that $\mathbb {P} (\xi \neq f_{\epsilon })<\epsilon$  and $|f_{\epsilon }(x)|<B$  for all $x$ .

Since $\xi (x)\neq 0$  for all $x$ , we have $\xi ^{2}>0$  everywhere, so $E[\xi ^{2}]=C>0$ .

$E[\xi ^{2}]=\mathbb {P} (\xi =f_{\epsilon })E[\xi f_{\epsilon }|\xi =f_{\epsilon }]+\mathbb {P} (\xi \neq f_{\epsilon })E[\xi ^{2}|\xi \neq f_{\epsilon }]$

$E[\xi f_{\epsilon }]=\mathbb {P} (\xi =f_{\epsilon })E[\xi f_{\epsilon }|\xi =f_{\epsilon }]+\mathbb {P} (\xi \neq f_{\epsilon })E[\xi f_{\epsilon }|\xi \neq f_{\epsilon }]$

$=C-\mathbb {P} (\xi \neq f_{\epsilon })E[\xi ^{2}|\xi \neq f_{\epsilon }]+\mathbb {P} (\xi \neq f_{\epsilon })E[\xi f_{\epsilon }|\xi \neq f_{\epsilon }]$

$>C-2\epsilon B^{2}$ , since $|E[\xi ^{2}|\xi \neq f_{\epsilon }]|\leq B^{2}$  and $|E[\xi f_{\epsilon }|\xi \neq f_{\epsilon }]|\leq B^{2}$ .

Taking $\epsilon <C/(2B^{2})$  and $\eta =f_{\epsilon }$ , the result follows.

## Problem 6

 Suppose that n points are independently and uniformly distributed on a unit circle. Let $\xi _{n}$ be the smallest of the distances between the points. Prove that $n\xi _{n}$ tends to zero in probability as $n\rightarrow \infty$ .

### Solution

Let $C>0$ . Partition the circle into $\lfloor Cn^{2}\rfloor$  non-intersecting arcs of equal length. Assume $\lfloor Cn^{2}\rfloor >>n$  The probability that n points are selected at random and at least two fall into the same arc is given by

$1-\prod _{i=0}^{n-1}{\frac {\lfloor Cn^{2}\rfloor -i}{\lfloor Cn^{2}\rfloor }}=1-\prod _{i=0}^{n-1}\left(1-{\frac {i}{\lfloor Cn^{2}\rfloor }}\right)\approx 1-e^{-1/(2C)}$

for large $n$ , since $\sum _{i=0}^{n-1}i/\lfloor Cn^{2}\rfloor =n(n-1)/(2\lfloor Cn^{2}\rfloor )\rightarrow 1/(2C)$ .

Taking $C$  sufficiently small, we can ensure $e^{-1/(2C)}<\epsilon$ , so that for all large $n$  some two points share an arc with probability at least $1-\epsilon$ .

If two points fall into the same arc, the distance between them is less than the arc length $2\pi /\lfloor Cn^{2}\rfloor$ , so on this event

$n\xi _{n}<{\frac {2\pi n}{\lfloor Cn^{2}\rfloor }}$

which tends to 0 as $n\rightarrow \infty$ . Hence for any $\delta >0$ , $\mathbb {P} (n\xi _{n}\geq \delta )\leq \epsilon$  for all large $n$ . Since $\epsilon$  may be taken arbitrarily small, $n\xi _{n}$  tends to zero in probability.
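
A quick illustration (a sketch; distances here are measured along the circle, where the minimum gap occurs between consecutive sorted angles, so sorting suffices; $n$ and the seed are arbitrary):

```python
import math
import random

random.seed(0)
n = 1000
# n i.i.d. uniform points on the unit circle, parametrized by angle in [0, 2*pi)
angles = sorted(random.uniform(0.0, 2.0 * math.pi) for _ in range(n))
gaps = [angles[i + 1] - angles[i] for i in range(n - 1)]
gaps.append(2.0 * math.pi - angles[-1] + angles[0])  # wrap-around gap
xi_n = min(gaps)  # minimum arc distance; the minimum chord is at the same pair
print(n * xi_n)   # small: the typical minimum gap is of order 1/n^2
```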