UMD Probability Qualifying Exams/Jan2015Probability

Problem 1

A die is rolled infinitely many times.

(a) Find the expected number of combinations (6,6) among the first 1000 rolls. Find the expected number of combinations (1,2) among the first 1000 rolls.

(b) What is larger and by how much: the expected number of rolls till the first occurrence of (6,6) or the expected number of rolls till the first occurrence of (1,2)?

(c) What is the probability that (6,6) occurs before (1,2)?

Solution

(a) Let $X_n$ be the value of the $n$th roll. The number of (6,6)'s among the first 1000 rolls is

$$N_{(6,6)} = \sum_{n=1}^{999} \mathbf{1}_{\{X_n = 6,\ X_{n+1} = 6\}}.$$

By linearity of expectation,

$$E\big[N_{(6,6)}\big] = \sum_{n=1}^{999} P(X_n = 6,\ X_{n+1} = 6) = \frac{999}{36}.$$

Similarly, the expected number of (1,2)'s among the first 1000 rolls is also $\frac{999}{36}$, since $P(X_n = 1,\ X_{n+1} = 2) = \frac{1}{36}$ as well.
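As a quick numerical sanity check, here is a minimal Monte Carlo sketch (our own illustration; the helper expected_pair_count is not from the exam):

```python
import random

def expected_pair_count(pattern, n_rolls=1000, n_trials=5000):
    """Estimate the expected number of occurrences of `pattern`
    among n_rolls consecutive die rolls."""
    total = 0
    for _ in range(n_trials):
        rolls = [random.randint(1, 6) for _ in range(n_rolls)]
        total += sum(1 for i in range(n_rolls - 1)
                     if (rolls[i], rolls[i + 1]) == pattern)
    return total / n_trials

print(expected_pair_count((6, 6)))  # both estimates near 999/36 = 27.75
print(expected_pair_count((1, 2)))
```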

(b) For the general approach to solving such problems see [1]. The expected number of rolls until the first (6,6) is $6 + 36 = 42$, while the expected number of rolls until the first (1,2) is $0 + 36 = 36$; so the expected wait for (6,6) is larger, by 6 rolls.

The argument is as follows. Suppose we continue rolling the die until the desired pattern first occurs, at some random time $T$. At each time $n$ a new gambler begins a game with 1 coin. On a gambler's $i$th bet, he stakes all his coins on the event that the next roll equals the $i$th character of the pattern. If the gambler loses a bet, he forfeits all his coins; if he wins, his stake is multiplied by 6 (a fair payout, since each face has probability $1/6$). The total fortune of all gamblers, minus the number of gamblers who have entered so far, is a martingale, so by the optional stopping theorem the expected total fortune held when the pattern is completed equals the expected number of gamblers who joined the game, i.e. the expected waiting time $E[T]$ for the pattern.

For the pattern (6,6), the gambler who entered at time $T-1$ has won both his bets and holds 36 coins, while the gambler who entered at time $T$ has won his single bet and holds 6 coins. The others have 0, so $E[T] = 36 + 6 = 42$.

For the pattern (1,2), the gambler who entered at time $T-1$ holds 36 coins; the gambler who entered at time $T$ bet on a 1, saw a 2, and lost. The others have 0, so $E[T] = 36$.

  • Note that the optional stopping theorem applies because the stopping time has finite expectation and the martingale increments are bounded.
  • A more direct way to do this problem is with absorbing Markov chains; a simulation check is sketched below.
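The two waiting times are easy to check by direct simulation; the following minimal sketch (ours, with a hypothetical helper mean_wait) estimates both:

```python
import random

def mean_wait(pattern, n_trials=100000):
    """Estimate the expected number of rolls until `pattern` first
    occurs as a pair of consecutive rolls."""
    total = 0
    for _ in range(n_trials):
        prev, n = None, 0
        while True:
            roll = random.randint(1, 6)
            n += 1
            if (prev, roll) == pattern:
                break
            prev = roll
        total += n
    return total / n_trials

print(mean_wait((6, 6)))  # should be close to 42
print(mean_wait((1, 2)))  # should be close to 36
```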

(c) Since neither pattern overlaps the other, the odds that (6,6) occurs first are inversely proportional to the expected waiting times, i.e. $36 : 42$. That is, the probability that (6,6) occurs first is $\frac{36}{36+42} = \frac{6}{13}$.
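The value $6/13 \approx 0.4615$ is also easy to confirm numerically; a minimal sketch of the race between the two patterns (again our own helper, not from [1]):

```python
import random

def p_first(pattern_a, pattern_b, n_trials=100000):
    """Estimate the probability that pattern_a occurs (as a pair of
    consecutive rolls) strictly before pattern_b."""
    wins = 0
    for _ in range(n_trials):
        prev = None
        while True:
            roll = random.randint(1, 6)
            if (prev, roll) == pattern_a:
                wins += 1
                break
            if (prev, roll) == pattern_b:
                break
            prev = roll
    return wins / n_trials

print(p_first((6, 6), (1, 2)))  # should be close to 6/13 = 0.4615...
```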

For patterns which do overlap, the analysis is slightly more complicated; see Corollary 3.2 in Li's paper [1].

[1] Shuo-Yen Robert Li, "A Martingale Approach to the Study of Occurrence of Sequence Patterns in Repeated Experiments", Annals of Probability, 8(6):1171-1176, 1980.

Problem 2

We'll say that a random variable is discrete if the measure it induces is discrete, and absolutely continuous if the measure it induces is absolutely continuous with respect to Lebesgue measure. Prove or disprove each of the following.

(a) A sum of two discrete random variables is always a discrete random variable.

(b) A sum of a discrete and an absolutely continuous random variable is always an absolutely continuous random variable.

(c) A sum of two absolutely continuous random variables is always an absolutely continuous random variable.

(d) A sum of two independent discrete random variables is always a discrete random variable.

(e) A sum of a discrete and an absolutely continuous random variable that are independent is always an absolutely continuous random variable.

(f) A sum of two independent absolutely continuous random variables is always an absolutely continuous random variable.

Solution

(a) True. A random variable is discrete iff it takes values in some countable set almost surely. If $A$ takes values in the countable set $S_A$ a.s. and $B$ takes values in the countable set $S_B$ a.s., then $A + B$ takes values a.s. in the sum-set $S_A + S_B = \{a + b : a \in S_A,\ b \in S_B\}$, which is countable.

(b) True. A random variable $X$ is absolutely continuous iff $P(X \in E) = 0$ for every Lebesgue-null set $E$. Let $Y$ be a discrete random variable taking values $y_1, y_2, \ldots$. For any Lebesgue-null set $E$,

$$P(X + Y \in E) = \sum_i P(X \in E - y_i,\ Y = y_i) \le \sum_i P(X \in E - y_i) = 0,$$

since each translate $E - y_i$ of a Lebesgue-null set is Lebesgue-null. Note that independence is not needed.

(c) False. For any absolutely continuous random variable $X$, $-X$ is also absolutely continuous, yet $X + (-X) = 0$ is a discrete random variable.

(d) True, by (a).

(e) True, by (b).

(f) True. If $X$ and $Y$ are independent and absolutely continuous with densities $f_X$ and $f_Y$, then $X + Y$ has the density given by the convolution

$$f_{X+Y}(z) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(z - x)\, dx,$$

so $X + Y$ is absolutely continuous.
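As a concrete illustration of (f) (our own numerical sketch, not part of the exam): for $X, Y$ independent Uniform(0,1), the convolution formula gives the triangular density $f(z) = z$ on $[0,1]$ and $f(z) = 2 - z$ on $[1,2]$, which a histogram-style estimate reproduces:

```python
import random

# X, Y independent Uniform(0,1); by the convolution formula X + Y has the
# triangular density f(z) = z on [0,1] and f(z) = 2 - z on [1,2].
samples = [random.random() + random.random() for _ in range(200000)]

for z in (0.25, 0.75, 1.25, 1.75):
    h = 0.05  # half-width of the window for the local density estimate
    est = sum(z - h < s < z + h for s in samples) / (len(samples) * 2 * h)
    true = z if z <= 1 else 2 - z
    print(f"z = {z}: estimated {est:.3f} vs convolution formula {true:.3f}")
```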

Problem 3

Suppose that $\xi_1, \xi_2, \ldots$ are independent random variables uniformly distributed on the intervals $[-a_1, a_1], [-a_2, a_2], \ldots$ respectively, where $a_1, a_2, \ldots$ are positive. Let $\zeta_n = \xi_1 \xi_2 \cdots \xi_n$.

(a) Prove that $\zeta_n$ tends to zero almost surely if $a_n \le 2$ for all $n$.

(b) Prove that $\zeta_n$ does not have a limit, almost surely, if $a_n \ge 3$ for all $n$.

Solution

(a) Write $\xi_n = a_n \eta_n$, where $\eta_n$ is uniformly distributed on $[-1, 1]$. Then

$$\ln|\zeta_n| = \sum_{i=1}^{n} \ln a_i + \sum_{i=1}^{n} \ln|\eta_i|,$$

$$E\big[\ln|\eta_i|\big] = \frac{1}{2}\int_{-1}^{1} \ln|x|\, dx = \int_0^1 \ln x\, dx = -1.$$

By the strong law of large numbers,

$\frac{1}{n}\sum_{i=1}^{n} \ln|\eta_i|$ converges to $-1$ almost surely.

Since $a_i \le 2$ for all $i$,

$$\frac{1}{n}\ln|\zeta_n| \le \ln 2 + \frac{1}{n}\sum_{i=1}^{n} \ln|\eta_i| \longrightarrow \ln 2 - 1 < 0 \quad \text{a.s.}$$

So $\ln|\zeta_n|$ converges to $-\infty$ a.s., and hence $\zeta_n$ converges to 0 a.s.

(b) In this case

$$\frac{1}{n}\ln|\zeta_n| \ge \ln 3 + \frac{1}{n}\sum_{i=1}^{n} \ln|\eta_i| \longrightarrow \ln 3 - 1 > 0 \quad \text{a.s.}$$

So $\ln|\zeta_n|$ converges to $+\infty$ a.s., and hence $|\zeta_n|$ converges to $+\infty$ almost surely. However, the events $\{\xi_n < 0\}$ are independent and each has probability $1/2$, so by the second Borel-Cantelli lemma infinitely many of them occur a.s.; thus the sign of $\zeta_n$ changes infinitely often, and $\zeta_n$ has no limit almost surely.
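A minimal simulation sketch (our illustration; the endpoints 2 and 3 are the constants from parts (a) and (b)) showing that $\frac{1}{n}\ln|\zeta_n|$ settles near $\ln a - 1$, negative in the first regime and positive in the second:

```python
import math
import random

def avg_log_abs_zeta(a, n):
    """One realization of (1/n) * ln|zeta_n| with xi_i uniform on [-a, a]."""
    return sum(math.log(abs(random.uniform(-a, a))) for _ in range(n)) / n

n = 100000
print(avg_log_abs_zeta(2.0, n), math.log(2) - 1)  # negative: zeta_n -> 0
print(avg_log_abs_zeta(3.0, n), math.log(3) - 1)  # positive: |zeta_n| -> infinity
```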

Problem 4

Let $M_t$, $t \ge 0$, be a positive continuous martingale with $M_0 = 1$ that converges to zero almost surely as $t \to \infty$.

(a) Prove that for each $a > 0$,

$$P\Big(\sup_{t \ge 0} M_t \ge a\Big) = \min\Big(1, \frac{1}{a}\Big).$$

(b) Let $W_t$ be a standard one-dimensional Brownian motion. Prove that for each $\lambda > 0$ the random variable

$$\sup_{t \ge 0}\Big(W_t - \frac{\lambda t}{2}\Big)$$

has exponential distribution with parameter $\lambda$.

Solution

(a) Fix $a > 1$. Let $\tau$ be the stopping time given by $\tau = \min(t, u)$, where $u$ is the first time $M_u = a$ (with $u = \infty$ if $M$ never reaches $a$). The stopped process is bounded by $a$, so by the optional stopping theorem,

$$1 = M_0 = E[M_\tau] = a\, P(u \le t) + E\big[M_t \mathbf{1}_{\{u > t\}}\big].$$

Since $M_t$ converges to zero a.s. and $M_t \mathbf{1}_{\{u > t\}} \le a$, taking the limit as $t$ goes to infinity (by dominated convergence) we have

$$P\Big(\sup_{t \ge 0} M_t \ge a\Big) = P(u < \infty) = \frac{1}{a} \quad \text{if } a > 1.$$

Clearly, if $a \le 1$, then $\sup_{t \ge 0} M_t \ge M_0 = 1 \ge a$, so the probability is 1.

(b) Wald's martingale

$$M_t = \exp\Big(\lambda W_t - \frac{\lambda^2 t}{2}\Big)$$

is a positive continuous martingale with $M_0 = 1$, and $M_t \to 0$ a.s. because $W_t / t \to 0$ a.s. Moreover, for $u \ge 0$,

$$\sup_{t \ge 0}\Big(W_t - \frac{\lambda t}{2}\Big) \ge u \iff \sup_{t \ge 0} M_t \ge e^{\lambda u}.$$

By (a),

$$P\Big(\sup_{t \ge 0}\Big(W_t - \frac{\lambda t}{2}\Big) \ge u\Big) = e^{-\lambda u}, \quad u \ge 0,$$

which is the tail of the exponential distribution with parameter $\lambda$.
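A discretized simulation sketch (our illustration; the grid step and the truncation at t_max introduce a small downward bias) comparing the empirical mean of $\sup_t (W_t - \lambda t/2)$ with the exponential mean $1/\lambda$:

```python
import math
import random

def sup_drifted_bm(lam, t_max=50.0, dt=0.01):
    """One sample of the max over a grid of (W_t - lam*t/2), 0 <= t <= t_max."""
    w, best = 0.0, 0.0
    sd = math.sqrt(dt)
    for k in range(1, int(t_max / dt) + 1):
        w += random.gauss(0.0, sd)  # Brownian increment over one step
        best = max(best, w - lam * k * dt / 2)
    return best

lam = 1.0
samples = [sup_drifted_bm(lam) for _ in range(500)]
print(sum(samples) / len(samples))  # should be near 1/lam = 1.0
```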

Problem 5

Suppose that $f$ is a bounded random variable on $(X, \mathcal{B}, P)$, where $X$ is a metric space, $\mathcal{B}$ is its Borel $\sigma$-algebra, and $P$ is a probability measure, and that $|f(x)| \le M$ for all $x \in X$. Prove that for each $\varepsilon > 0$ there is a bounded continuous function $g$ on $X$ such that $E|f - g| < \varepsilon$.

Solution

By Lusin's theorem, for every $\delta > 0$ there exists a continuous $f_\delta$ on $X$ such that $P(f \ne f_\delta) < \delta$ and $|f_\delta(x)| \le M$ for all $x$. Then

$$E|f - f_\delta| = E\big[|f - f_\delta|\, \mathbf{1}_{\{f \ne f_\delta\}}\big] \le E\big[(|f| + |f_\delta|)\, \mathbf{1}_{\{f \ne f_\delta\}}\big] \le 2M\, P(f \ne f_\delta) < 2M\delta.$$

Taking $\delta = \varepsilon / (2M)$ and $g = f_\delta$, the result follows.
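As a concrete instance of the approximation (our own illustration, with $X = [0,1]$, $P$ Lebesgue measure, $f = \mathbf{1}_{[0,1/2]}$, and $M = 1$): a piecewise-linear $g$ that agrees with $f$ off an interval of length $\delta$ is continuous, bounded by 1, and has $E|f - g| = \delta/2$.

```python
# X = [0,1] with P = Lebesgue measure, f = indicator of [0, 1/2], M = 1.
# g ramps linearly from 1 to 0 on [1/2, 1/2 + DELTA]: continuous, |g| <= 1,
# and it differs from f only on that interval.
DELTA = 0.01

def f(x):
    return 1.0 if x <= 0.5 else 0.0

def g(x):
    if x <= 0.5:
        return 1.0
    if x >= 0.5 + DELTA:
        return 0.0
    return 1.0 - (x - 0.5) / DELTA  # linear ramp

# Riemann-sum estimate of E|f - g|; should print roughly DELTA/2 = 0.005.
n = 100000
print(sum(abs(f((i + 0.5) / n) - g((i + 0.5) / n)) for i in range(n)) / n)
```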

Problem 6

Suppose that $n$ points are independently and uniformly distributed on a unit circle. Let $d_n$ be the smallest of the pairwise distances between the points. Prove that $n\, d_n$ tends to zero in probability as $n \to \infty$.

Solution

Let $C > 0$. Partition the circle into $Cn^2$ non-intersecting arcs of equal length (assume for simplicity that $Cn^2$ is an integer). The probability that at least two of the $n$ points fall into the same arc is

$$1 - \prod_{i=1}^{n-1}\Big(1 - \frac{i}{Cn^2}\Big),$$

and

$$\prod_{i=1}^{n-1}\Big(1 - \frac{i}{Cn^2}\Big) \le \exp\Big(-\sum_{i=1}^{n-1} \frac{i}{Cn^2}\Big) = \exp\Big(-\frac{n-1}{2Cn}\Big) \le e^{-1/(4C)} \quad \text{for } n \ge 2.$$

Taking $C$ sufficiently small, we can ensure

$$P(\text{no two points share an arc}) \le e^{-1/(4C)} < \delta$$

for any prescribed $\delta > 0$, uniformly in $n \ge 2$.

If two points fall into the same arc, the distance between them is less than the arc length, so on that event

$$d_n < \frac{2\pi}{Cn^2} \quad \text{and} \quad n\, d_n < \frac{2\pi}{Cn},$$

which tends to 0. Hence for each fixed $\varepsilon > 0$ and $C > 0$, once $2\pi/(Cn) < \varepsilon$ we have $P(n\, d_n \ge \varepsilon) \le e^{-1/(4C)}$. Since $\delta$ (equivalently $C$) may be taken to be arbitrarily small, we get that $n\, d_n$ tends to zero in probability.
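A brute-force sketch (ours; it uses chord distances on the standard unit circle and exploits the fact that the closest pair of points is angularly adjacent) showing the decay of $n\, d_n$:

```python
import math
import random

def min_pair_dist(n):
    """Smallest chord distance among n uniform points on the unit circle."""
    angles = sorted(random.uniform(0, 2 * math.pi) for _ in range(n))
    # the closest pair is angularly adjacent, so it suffices to scan
    # consecutive gaps (including the wrap-around gap)
    gap = min((angles[(i + 1) % n] - angles[i]) % (2 * math.pi)
              for i in range(n))
    return 2 * math.sin(gap / 2)  # chord length subtended by the gap

for n in (100, 1000, 10000):
    avg = sum(n * min_pair_dist(n) for _ in range(200)) / 200
    print(n, avg)  # n * d_n keeps shrinking (roughly like 2*pi/n)
```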