# UMD Probability Qualifying Exams/Jan2011Probability

## Problem 1

 A person plays an infinite sequence of games. He wins the $n$ th game with probability $1/{\sqrt {n}}$ , independently of the other games. (i) Prove that for any $A$ , the probability is one that the player will accumulate $A$ dollars if he gets a dollar each time he wins two games in a row. (ii) Does the claim in part (i) hold true if the player gets a dollar only if he wins three games in a row? Prove or disprove it.

### Solution

(i): Define the person's game as the infinite sequence $\omega =\{\omega _{1},\omega _{2},...\}$  where each $\omega _{k}$  equals either 1 (corresponding to a win) or 0 (corresponding to a loss).

Define the random variable $\tau _{k}:\Omega \to \mathbb {N}$  by

$\tau _{k}(\omega )=\#\{2\leq j\leq k:\omega _{j-1}=\omega _{j}=1\},$  that is, $\tau _{k}$  counts how many times the player received two consecutive wins in his first $k$  games. Thus, the player wins $\tau _{k}$  dollars in the first $k$  games. Clearly, $\tau _{k}$  is measurable. Moreover, by linearity of expectation:

$E[\tau _{k}(\omega )]=\sum _{j=2}^{k}{\frac {1}{\sqrt {j}}}{\frac {1}{\sqrt {j-1}}}.$

Now observe what happens as we send $k\to \infty$ . By monotone convergence,

$E[\lim _{k\to \infty }\tau _{k}(\omega )]=\lim _{k\to \infty }\sum _{j=2}^{k}{\frac {1}{\sqrt {j}}}{\frac {1}{\sqrt {j-1}}}=\infty ,$

since the terms are comparable to $1/j$ . An infinite expectation alone, however, does not force $\tau _{k}\to \infty$  almost surely, so we argue via Borel–Cantelli. Consider the events $B_{i}=\{\omega _{2i-1}=\omega _{2i}=1\}$ , $i\geq 1$ . They involve disjoint blocks of games, hence are independent, and

$P(B_{i})={\frac {1}{\sqrt {2i-1}}}{\frac {1}{\sqrt {2i}}}\geq {\frac {1}{2i}},\qquad \sum _{i=1}^{\infty }P(B_{i})=\infty .$

By the second Borel–Cantelli lemma, infinitely many of the $B_{i}$  occur with probability one. Each occurrence pays at least one dollar, so the player's winnings tend to infinity almost surely; in particular, for any $A$  he accumulates $A$  dollars with probability one.

(ii): Define everything as before except this time $\tau _{k}(\omega )=\#\{3\leq j\leq k:\omega _{j-2}=\omega _{j-1}=\omega _{j}=1\}.$

Then $E[\tau _{k}(\omega )]=\sum _{j=3}^{k}{\frac {1}{\sqrt {j}}}{\frac {1}{\sqrt {j-1}}}{\frac {1}{\sqrt {j-2}}},$  whose terms are comparable to $j^{-3/2}$ , so $E[\lim _{k\to \infty }\tau _{k}(\omega )]<\infty .$  In fact more is true: letting $C_{j}=\{\omega _{j-2}=\omega _{j-1}=\omega _{j}=1\}$ , we have $\sum _{j}P(C_{j})<\infty$ , so by the (first) Borel–Cantelli lemma only finitely many of the $C_{j}$  occur almost surely. The total winnings are therefore finite with probability one, and for sufficiently large $A$  the probability of accumulating $A$  dollars is strictly less than one. The claim of part (i) does not hold.
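The contrast between the two series can be seen numerically; a small sketch (the cutoffs $10^{3}$ and $10^{6}$ are chosen arbitrarily for illustration):

```python
import math

def pair_series(k):
    """Partial sum of 1/sqrt(j(j-1)) for j = 2..k (two-wins-in-a-row expectation)."""
    return sum(1.0 / math.sqrt(j * (j - 1)) for j in range(2, k + 1))

def triple_series(k):
    """Partial sum of 1/sqrt(j(j-1)(j-2)) for j = 3..k (three-in-a-row expectation)."""
    return sum(1.0 / math.sqrt(j * (j - 1) * (j - 2)) for j in range(3, k + 1))

# The pair series keeps growing (like the harmonic series) ...
print(pair_series(10**3), pair_series(10**6))
# ... while the triple series barely moves past k = 1000.
print(triple_series(10**3), triple_series(10**6))
```
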

## Problem 2

 There are 10 coins in a bag. Five of them are normal coins, one coin has two heads and four coins have two tails. You pull one coin out, look at one of its sides and see that it is a tail. What is the probability that it is a normal coin?

### Solution

This is just a direct application of Bayes' theorem. Let $N$  denote the event that you pulled a normal coin, and let $T$  denote the event that the side you look at is a tail.

By Bayes,

$P(N|T)={\frac {P(N\cap T)}{P(T)}}={\frac {5/20}{13/20}}=5/13.$

Here $P(N\cap T)=5/20$ : of the 20 equally likely coin faces, 5 are tails belonging to normal coins. Likewise $P(T)=13/20$ , since 13 of the 20 faces are tails (5 on the normal coins and $2\times 4=8$  on the four two-tailed coins).
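The answer $5/13$ can be checked by a quick Monte Carlo simulation (seed and trial count chosen arbitrarily):

```python
import random

random.seed(0)

# Faces of the 10 coins: 5 normal (H, T), 1 double-head, 4 double-tail.
coins = [("H", "T")] * 5 + [("H", "H")] + [("T", "T")] * 4

tails = 0          # trials in which the observed side is a tail
tails_normal = 0   # ... and the coin is normal

for _ in range(200_000):
    coin = random.choice(coins)        # draw a coin uniformly
    side = random.choice(coin)         # look at one of its sides uniformly
    if side == "T":
        tails += 1
        tails_normal += coin == ("H", "T")

print(tails_normal / tails)   # should be close to 5/13 ≈ 0.3846
```
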

## Problem 3

 Let $X_{n}$ be a Markov chain with state space $\mathbb {N}$ , with transition probabilities $P(z,z^{2})=P(z,z-1)=1/2$ for $z\geq 2$ , $P(z,z+1)=1$ for $z=1$ . (i) Find a strictly monotonically decreasing non-negative function $f:\mathbb {N} \to \mathbb {R} ^{+}$ such that $f(X_{n})$ is a supermartingale. (ii) Prove that for each initial distribution, $P(\lim _{n\to \infty }X_{n}=+\infty )=1$ .

### Solution

(i) We need a strictly decreasing, non-negative $f$  with $E[f(X_{n+1})\mid X_{n}=z]\leq f(z)$  for every state $z$ ; by the Markov property this is exactly the supermartingale inequality $E[f(X_{n+1})\mid {\mathcal {F}}_{n}]\leq f(X_{n})$ .

Note first that $f(n)=1/n$  does not work: at $z=2$ , ${\tfrac {1}{2}}f(4)+{\tfrac {1}{2}}f(1)={\tfrac {1}{8}}+{\tfrac {1}{2}}={\tfrac {5}{8}}>f(2)={\tfrac {1}{2}}$ . Instead take

$f(1)=1.3,\quad f(2)=1.2,\quad f(z)={\frac {1}{\log z}}{\text{ for }}z\geq 3.$

This $f$  is positive and strictly decreasing ( $f(3)=1/\log 3\approx 0.91<1.2<1.3$ ). We check the one-step inequality state by state.

For $z=1$ : $E[f(X_{n+1})\mid X_{n}=1]=f(2)=1.2\leq 1.3=f(1)$ .

For $z=2$ : ${\tfrac {1}{2}}f(1)+{\tfrac {1}{2}}f(4)=0.65+{\tfrac {1}{2\log 4}}\approx 1.01\leq 1.2=f(2)$ .

For $z=3$ : ${\tfrac {1}{2}}f(2)+{\tfrac {1}{2}}f(9)=0.6+{\tfrac {1}{2\log 9}}\approx 0.83\leq f(3)\approx 0.91$ .

For $z\geq 4$ : $E[f(X_{n+1})\mid X_{n}=z]={\tfrac {1}{2}}{\frac {1}{\log z^{2}}}+{\tfrac {1}{2}}{\frac {1}{\log(z-1)}}={\frac {1}{4\log z}}+{\frac {1}{2\log(z-1)}}$ . Since $z^{2}\leq (z-1)^{3}$  for $z\geq 4$  (at $z=4$  this reads $16\leq 27$ , and $(z-1)^{3}-z^{2}$  is increasing in $z$ ), we get $2\log z\leq 3\log(z-1)$ , i.e. ${\frac {1}{2\log(z-1)}}\leq {\frac {3}{4\log z}}$ , so the right-hand side is at most ${\frac {1}{4\log z}}+{\frac {3}{4\log z}}={\frac {1}{\log z}}=f(z)$ .

Hence $f(X_{n})$  is a supermartingale.

(ii) $f(X_{n})$  is a non-negative supermartingale, so by the martingale convergence theorem it converges almost surely to some limit $L$ . The values $f(1)>f(2)>\cdots$  are distinct and accumulate only at $\lim _{z\to \infty }f(z)=0$ , so if $L>0$  then $f(X_{n})$  must eventually be constant, i.e. $X_{n}=z$  for all large $n$  and some fixed $z$ . This is impossible, since from every state the chain moves to a different state ( $P(z,z)=0$  for all $z$ ). Therefore $L=0$  almost surely, which means $X_{n}\to +\infty$ . Since the argument did not use the initial distribution, $P(\lim _{n\to \infty }X_{n}=+\infty )=1$  for every initial distribution.
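A numerical sanity check of the one-step supermartingale inequality $E[f(X_{n+1})\mid X_{n}=z]\leq f(z)$ for one candidate function (the piecewise choice $f(1)=1.3$ , $f(2)=1.2$ , $f(z)=1/\log z$ for $z\geq 3$ is an assumption of this sketch):

```python
import math

def f(z):
    # Candidate supermartingale function: strictly decreasing and positive.
    if z == 1:
        return 1.3
    if z == 2:
        return 1.2
    return 1.0 / math.log(z)

def one_step(z):
    """E[f(X_{n+1}) | X_n = z] under the given transition probabilities."""
    if z == 1:
        return f(2)                          # P(1, 2) = 1
    return 0.5 * f(z * z) + 0.5 * f(z - 1)   # P(z, z^2) = P(z, z-1) = 1/2

# Check the supermartingale inequality at every state we can feasibly test.
for z in range(1, 10_000):
    assert one_step(z) <= f(z), z

# f is strictly decreasing.
assert all(f(z + 1) < f(z) for z in range(1, 10_000))
print("one-step supermartingale inequality holds for all z < 10000")
```
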

## Problem 4

 Let $\xi _{n}$ be i.i.d. random variables with $P(\xi _{n}=-1)=P(\xi _{n}=1)=1/2$ . (i) Prove that the series $\sum _{n=1}^{\infty }e^{-n}\xi _{n}$ converges with probability one. (ii) Prove that the distribution of $\xi =\sum _{n=1}^{\infty }e^{-n}\xi _{n}$ is singular, i.e., concentrated on a set of Lebesgue measure zero.

### Solution

(i) Notice that

$|\sum _{n=1}^{\infty }e^{-n}\xi _{n}|\leq \sum _{n=1}^{\infty }e^{-n}={\frac {1}{e-1}}.$  So the series is bounded. Moreover, its partial sums are Cauchy for every $\omega$ . Indeed, for any $\epsilon >0$  we can select $N$  sufficiently large so that $\sum _{k=N+1}^{\infty }e^{-k}={\frac {e^{-N}}{e-1}}<\epsilon$ ; then for every $m>n>N$ , $|\sum _{k=n}^{m}e^{-k}\xi _{k}|\leq \sum _{k=n}^{m}e^{-k}<\epsilon .$  Hence, the series $\sum _{n=1}^{\infty }e^{-n}\xi _{n}$  converges surely, and in particular with probability one.
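The pathwise Cauchy bound can be illustrated numerically; the random signs below are one arbitrary sample, and the tail bound holds for every choice of signs:

```python
import math
import random

random.seed(1)
signs = [0] + [random.choice([-1, 1]) for _ in range(40)]  # xi_1 .. xi_40

def partial_sum(n):
    """S_n = sum_{k=1}^{n} e^{-k} * xi_k for one fixed sample of signs."""
    return sum(signs[k] * math.exp(-k) for k in range(1, n + 1))

# |S_m - S_n| is bounded by the deterministic tail sum_{k>n} e^{-k} = e^{-n}/(e-1),
# regardless of the signs.
for n in (5, 10, 20):
    tail = math.exp(-n) / (math.e - 1)
    assert abs(partial_sum(40) - partial_sum(n)) <= tail
    print(n, abs(partial_sum(40) - partial_sum(n)), tail)
```
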

(ii) To show that $\xi$  is supported on a set of Lebesgue measure zero, first recall some facts about the Cantor set.

The Cantor set $C$  is the set of all $x\in [0,1]$  with ternary expansion $x=.a_{1}a_{2}\cdots ,\,a_{j}=0{\text{ or }}2$  (in base 3). This corresponds to the usual Cantor set, which can be thought of as the perfect symmetric set with contraction ratio 1/3.

Here, instead of ternary digits, $\xi$  takes values in the set $S=\{\sum _{n=1}^{\infty }\epsilon _{n}e^{-n}:\epsilon _{n}\in \{-1,1\}\}$ , a Cantor-type set built with ratio $1/e$ . Fix $N$  and condition on the first $N$  signs $\epsilon _{1},\dots ,\epsilon _{N}$ : the remaining tail $\sum _{n>N}\epsilon _{n}e^{-n}$  lies in an interval of length $2\sum _{n>N}e^{-n}={\frac {2e^{-N}}{e-1}}$ . Hence $S$  is covered by $2^{N}$  intervals of total length $2^{N}\cdot {\frac {2e^{-N}}{e-1}}={\frac {2}{e-1}}\left({\frac {2}{e}}\right)^{N}\to 0$  as $N\to \infty$ , since $2/e<1$ . Therefore $S$  has Lebesgue measure zero, and the distribution of $\xi$  is concentrated on a set of Lebesgue measure zero, i.e., singular.
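The covering bound can be tabulated directly (a small numerical sketch of the measure estimate, not part of the proof):

```python
import math

def cover_length(N):
    # 2^N covering intervals, each of length 2 * sum_{n>N} e^{-n} = 2 e^{-N} / (e - 1)
    return (2.0 ** N) * 2.0 * math.exp(-N) / (math.e - 1.0)

lengths = [cover_length(N) for N in range(0, 61, 10)]
print(lengths)

# The total covering length shrinks geometrically (ratio 2/e < 1) ...
assert all(b < a for a, b in zip(lengths, lengths[1:]))
# ... and is already below 1e-6 by N = 60.
assert cover_length(60) < 1e-6
```
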

## Problem 5

 Let $\xi _{n}$ be a sequence of independent random variables with $\xi _{n}$ uniformly distributed on $[0,n^{2}]$ . Find $a_{n}$ and $b_{n}$ such that $(\sum _{i=1}^{n}\xi _{i}-a_{n})/b_{n}$ converges in distribution to a nondegenerate limit and identify the limit.

### Solution

This is a direct application of the Central Limit Theorem under the Lindeberg condition.

We know that each random variable $\xi _{i}$  has mean $i^{2}/2$  and variance $i^{4}/12$ .

Then $a_{n}=\sum _{i=1}^{n}i^{2}/2={\frac {n(n+1)(2n+1)}{12}}$  and $b_{n}^{2}=\sum _{i=1}^{n}i^{4}/12$ . Then $(\sum _{i=1}^{n}\xi _{i}-a_{n})/b_{n}$  converges in distribution to the standard normal provided the Lindeberg condition holds.

Hence we want to check $\lim _{n\to \infty }{\frac {1}{b_{n}^{2}}}\sum _{i=1}^{n}\int _{\{x:|x-m_{i}|\geq \epsilon b_{n}\}}(x-m_{i})^{2}\,dF_{i}(x)=0$

Since $b_{n}^{2}=\sum _{i=1}^{n}i^{4}/12\sim n^{5}/60$ , $b_{n}$  grows like $n^{5/2}$ , while $|x-m_{i}|\leq i^{2}/2\leq n^{2}/2$  on the support of $F_{i}$  for every $i\leq n$ . Hence for sufficiently large $n$  we have $\epsilon b_{n}>n^{2}/2$ , and the domain of each integral is empty. The limit above is therefore 0, the Lindeberg condition is satisfied, and the CLT holds.
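As a quick Monte Carlo sanity check of this normalization (sample size, number of summands, and seed chosen arbitrarily):

```python
import random
import statistics

random.seed(2)

n = 50                                                        # number of summands
a_n = sum(i * i / 2 for i in range(1, n + 1))                 # mean of the sum
b_n = sum(i ** 4 / 12 for i in range(1, n + 1)) ** 0.5        # std dev of the sum

samples = []
for _ in range(20_000):
    s = sum(random.random() * i * i for i in range(1, n + 1))  # xi_i ~ U[0, i^2]
    samples.append((s - a_n) / b_n)

# The standardized sums should look approximately standard normal.
print(statistics.mean(samples), statistics.variance(samples))
```
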

## Problem 6

 (i) Let $X_{t},t>0$ be random variables defined on a probability space $(\Omega ,{\mathcal {F}},P)$ . Assuming that $E|X_{t}|^{2}=E|X|^{2}<\infty$ for all $t$ , prove that $P(\lim _{t\to 0}X_{t}=X)=1$ implies $\lim _{t\to 0}E|X_{t}-X|^{2}=0$ , i.e. under the above assumptions, almost sure convergence implies convergence in mean square. (ii) Let $X_{t},t\in \mathbb {R}$ be a random process with the property that $EX_{t}$ and $C(h)=E(X_{t}X_{h+t})$ are finite and do not depend on $t$ (such a process is called wide-sense stationary). Prove that the correlation function $C(h)$ is continuous if the trajectories of $X_{t}$ are continuous.

### Solution

(i) Let $A=\{\omega \in \Omega |\lim _{t\to 0}X_{t}(\omega )=X(\omega )\}$ . By assumption $P(A)=1$ .

By the elementary inequality $|a-b|^{2}\leq 2a^{2}+2b^{2}$ , the random variables

$Y_{t}=2|X_{t}|^{2}+2|X|^{2}-|X_{t}-X|^{2}$

are non-negative, and on $A$  (hence almost surely) $Y_{t}\to 4|X|^{2}$  as $t\to 0$ . Applying Fatou's lemma along an arbitrary sequence $t_{n}\to 0$ ,

$4E|X|^{2}=E[\liminf _{n\to \infty }Y_{t_{n}}]\leq \liminf _{n\to \infty }E[Y_{t_{n}}]=2E|X|^{2}+2E|X|^{2}-\limsup _{n\to \infty }E|X_{t_{n}}-X|^{2},$

where we used $E|X_{t}|^{2}=E|X|^{2}$  for all $t$ . Rearranging gives $\limsup _{n\to \infty }E|X_{t_{n}}-X|^{2}\leq 0$ , and since the quantity is non-negative and the sequence $t_{n}\to 0$  was arbitrary, $\lim _{t\to 0}E|X_{t}-X|^{2}=0$ .

Thus we have just shown that under the above assumptions, almost sure convergence implies convergence in mean square.
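The second-moment assumption in (i) is essential. As an illustration (this example is not part of the original problem), consider on $([0,1],{\text{Leb}})$  the process $X_{t}=t^{-1/2}\mathbf {1} _{[0,t]}$ : it converges to $X=0$  almost surely as $t\to 0$ , but $E|X_{t}|^{2}=1\neq 0=E|X|^{2}$ , and indeed $E|X_{t}-X|^{2}=1$  does not tend to 0. A numerical sketch:

```python
import math

# On ([0,1], Lebesgue): X_t(w) = t^{-1/2} * 1_{[0,t]}(w).
# X_t -> 0 pointwise for every w > 0 as t -> 0, yet E|X_t|^2 = 1 for all t,
# so there is no mean-square convergence: the assumption E|X_t|^2 = E|X|^2 fails.

def X(t, w):
    return t ** -0.5 if w <= t else 0.0

def second_moment(t, m=100_000):
    """Midpoint-rule approximation of E|X_t|^2 = int_0^1 X_t(w)^2 dw."""
    return sum(X(t, (i + 0.5) / m) ** 2 for i in range(m)) / m

# For each fixed w > 0, X_t(w) = 0 once t < w (pointwise convergence to 0) ...
assert X(0.001, 0.3) == 0.0
# ... but the second moment stays at 1 as t shrinks:
print(second_moment(0.1), second_moment(0.01))
```
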

(ii) Fix $h$  and let $h_{n}\to h$ . Since the trajectories of $X_{t}$  are continuous, $X_{t+h_{n}}\to X_{t+h}$  pointwise. Moreover, by wide-sense stationarity $E|X_{t+h_{n}}|^{2}=C(0)=E|X_{t+h}|^{2}$  for all $n$ , so the argument of part (i) applies and $X_{t+h_{n}}\to X_{t+h}$  in mean square. By the Cauchy–Schwarz inequality,

$|C(h_{n})-C(h)|=|E[X_{t}(X_{t+h_{n}}-X_{t+h})]|\leq (E|X_{t}|^{2})^{1/2}(E|X_{t+h_{n}}-X_{t+h}|^{2})^{1/2}\to 0.$

Hence $C(h)$  is continuous.