# Probability/Joint Distributions and Independence

## Motivation

Suppose we are given a pmf of a discrete random variable $X$  and a pmf of a discrete random variable $Y$ . For example,

$f_{X}(x)=(\mathbf {1} \{x=0\}+\mathbf {1} \{x=1\})/2\quad {\text{and}}\quad f_{Y}(y)=(\mathbf {1} \{y=0\}+\mathbf {1} \{y=2\})/2$

We cannot tell the relationship between $X$  and $Y$  from such information alone: they may or may not be related.

For example, the random variable $X$  may be defined by tossing a fair coin, with $X=1$  if a head comes up and $X=0$  otherwise, and the random variable $Y$  may be defined by tossing the coin a second time, with $Y=2$  if a head comes up and $Y=0$  otherwise. In this case, $X$  and $Y$  are unrelated.

Another possibility is that the random variable $Y$  is defined as $Y=2X$ , so that $Y=2$  if a head comes up in the first toss and $Y=0$  otherwise. In this case, $X$  and $Y$  are related.

Yet, in the above two examples, the pmf's of $X$  and $Y$  are exactly the same.

Therefore, to capture the relationship between $X$  and $Y$ , we define the joint cumulative distribution function, or joint cdf.

## Joint distributions

Definition. (Joint cumulative distribution function) Let $X_{1},\dotsc ,X_{n}$  be random variables defined on a sample space $\Omega$ . The joint cumulative distribution function (cdf) of random variables $X_{1},\dotsc ,X_{n}$  is

$F(x_{1},\dotsc ,x_{n})=\mathbb {P} (X_{1}\leq x_{1}\cap \cdots \cap X_{n}\leq x_{n})=\mathbb {P} \left(\bigcap _{i=1}^{n}\{\omega \in \Omega :X_{i}(\omega )\leq x_{i}\}\right).$

Sometimes, we may want to know the random behaviour in one of the random variables involved in a joint cdf. We can do this by computing the marginal cdf from joint cdf. The definition of marginal cdf is as follows:

Definition. (Marginal cumulative distribution function) The marginal cumulative distribution function (cdf) of $X_{i}$ , one of the $n$  random variables $X_{1},\dotsc ,X_{n}$ , is simply the cdf $F_{X_{i}}$  of the single random variable $X_{i}$ .

Remark. Actually, the marginal cdf of $X_{i}$  is simply the cdf of $X_{i}$  (which is in one variable). We have already discussed this kind of cdf in previous chapters.

Proposition. (Obtaining marginal cdf from joint cdf) Given a joint cdf $F(x_{1},\dotsc ,x_{n})$ , the marginal cdf of $X_{i}$  is

$F_{X_{i}}(x)=F(\infty ,\dotsc ,\infty ,\underbrace {x} _{i{\text{-th position}}},\infty ,\dotsc ,\infty ).$

Proof. For each $j\neq i$ , the event $\{X_{j}\leq \infty \}$  (interpreted as $\lim _{x\to \infty }\{X_{j}\leq x\}$ ) is the whole sample space $\Omega$ , so intersecting with it does not change a probability. Hence, when we set each argument other than the $i$ -th argument to $\infty$ , the joint cdf becomes

{\begin{aligned}\mathbb {P} (X_{1}\leq \infty \cap \cdots \cap X_{i-1}\leq \infty \cap X_{i}\leq x\cap X_{i+1}\leq \infty \cap \cdots \cap X_{n}\leq \infty )&=\mathbb {P} (\Omega \cap \cdots \cap \Omega \cap \{X_{i}\leq x\}\cap \Omega \cap \cdots \cap \Omega )\\&=\mathbb {P} (X_{i}\leq x)\\&=F_{X_{i}}(x).\end{aligned}}

$\Box$

Remark. In general, we cannot deduce the joint cdf from a given set of marginal cdf's.

Example. Consider the joint cdf of random variables $X$  and $Y$ :

$F(x,y)=1-e^{-x}-e^{-y}+e^{-xy}.$

The marginal cdf of $Y$  is
$F_{Y}(y)=\lim _{x\to \infty }(1-e^{-x}-e^{-y}+e^{-xy})=1-e^{-y}.$

Similar to the one-variable case, we have joint pmf and joint pdf. Also, analogously, we have marginal pmf and marginal pdf.

Definition. (Joint probability mass function) The joint probability mass function (joint pmf) of $X_{1},\dotsc ,X_{n}$  is

$f(x_{1},\dotsc ,x_{n})=\mathbb {P} {\big (}(X_{1},\dotsc ,X_{n})=(x_{1},\dotsc ,x_{n}){\big )},\quad (x_{1},\dotsc ,x_{n})\in \mathbb {R} ^{n}.$

Definition. (Marginal probability mass function) The marginal probability mass function (marginal pmf) of each $X_{i}$  which is a member of the $n$  random variables $X_{1},X_{2},\dotsc ,X_{n}$  is

$f_{X_{i}}(x)=\mathbb {P} (X_{i}=x),\quad x\in \mathbb {R} .$

Proposition. (Obtaining marginal pmf from joint pmf) For discrete random variables $X_{1},\dotsc ,X_{n}$  with joint pmf $f$ , the marginal pmf of $X_{i}$  is

$f_{X_{i}}({\color {red}x})=\underbrace {\sum _{u_{1}}\cdots \sum _{u_{i-1}}\sum _{u_{i+1}}\cdots \sum _{u_{n}}} _{n-1\;{\text{summations}}}f(u_{1},\dotsc ,u_{i-1},{\color {red}x},u_{i+1},\dotsc ,u_{n}).$

Proof. Consider the case in which there are only two random variables, say $X$  and $Y$ . Then, we have

$\sum _{\color {green}y}f({\color {red}x},{\color {green}y})=\sum _{\color {green}y}\mathbb {P} (X={\color {red}x}\cap {\color {green}Y=y})=\mathbb {P} (X={\color {red}x})\qquad {\text{by law of total probability}}.$

Similarly, in the general case, we have
{\begin{aligned}\sum _{\color {green}u_{n}}f(u_{1},\dotsc ,u_{i-1},{\color {red}x},u_{i+1},\dotsc ,{\color {green}u_{n}})&=\sum _{\color {green}u_{n}}\mathbb {P} (X_{1}=u_{1}\cap \cdots \cap X_{i-1}=u_{i-1}\cap X_{i}={\color {red}x}\cap X_{i+1}=u_{i+1}\cap \cdots \cap X_{n-1}=u_{n-1}\cap {\color {green}X_{n}=u_{n}})\\&=\mathbb {P} (X_{1}=u_{1}\cap \cdots \cap X_{i-1}=u_{i-1}\cap X_{i}={\color {red}x}\cap X_{i+1}=u_{i+1}\cap \cdots \cap X_{n-1}=u_{n-1})\qquad {\text{by law of total probability}}.\end{aligned}}

Then, we perform a similar process on each of the other variables ($n-2$  of them remain), with one extra summation sign added for each process. Thus, in total we will have $n-1$  summation signs, and we finally get the desired result. $\Box$

Remark. This process may sometimes be called 'summing over each possible value of the other variables'.

Example. Suppose we throw a fair six-faced die two times. Let $X$  be the number facing up in the first throw, and $Y$  be the number facing up in the second throw. Then, the joint pmf of $(X,Y)$  is

$f(x,y)=\mathbb {P} (X=x\cap Y=y)={\frac {1}{6}}\cdot {\frac {1}{6}}={\frac {1}{36}}.$

in which $x,y\in \{1,2,3,4,5,6\}$ , and $f(x,y)=0$  otherwise. Also, the marginal pmf of $X$  is
$f_{X}(x)=\sum _{y}f(x,y)=f(x,1)+f(x,2)+\cdots +f(x,6)=6(1/36)={\frac {1}{6}}$

in which $x\in \{1,2,3,4,5,6\}$ , and $f_{X}(x)=0$  otherwise.

By symmetry (replace all $X$  with $Y$  and replace all $x$  with $y$ ), the marginal pmf of $Y$  is

$f_{Y}(y)={\frac {1}{6}}$

in which $y\in \{1,2,3,4,5,6\}$ , and $f_{Y}(y)=0$  otherwise.
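The marginalization in this example can be checked with a short computation. This is a minimal sketch (the dictionary representation and names are arbitrary choices) that builds the joint pmf of the two throws and sums over each possible value of the other variable:

```python
from fractions import Fraction

# Joint pmf of (X, Y) for two fair six-faced dice: f(x, y) = 1/36.
joint = {(x, y): Fraction(1, 36) for x in range(1, 7) for y in range(1, 7)}

def marginal_X(joint, x):
    # Sum over each possible value of the other variable (here Y).
    return sum(p for (u, _), p in joint.items() if u == x)

print(marginal_X(joint, 3))  # 1/6
print(sum(joint.values()))   # 1 (a pmf must sum to one)
```

Exact fractions avoid floating-point noise, so the marginal comes out as exactly $1/6$.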

Exercise. Suppose there are two red balls and one blue ball in a box, and we draw two balls one by one from the box with replacement. Let $X=1$  if the ball from the first draw is red, and $X=0$  otherwise. Let $Y=1$  if the ball from the second draw is red, and $Y=0$  otherwise.

1 Calculate the marginal pmf of $X$ .

• $f_{X}(x)=(\mathbf {1} \{x=0\}+2\cdot \mathbf {1} \{x=1\})/3$
• $f_{X}(x)=(\mathbf {1} \{x=1\}+2\cdot \mathbf {1} \{x=0\})/3$
• $f_{X}(x)=2/3$
• $f_{X}(x)=(\mathbf {1} \{x=1\}+\mathbf {1} \{x=0\})/2$

2 Calculate the joint pmf of $(X,Y)$ .

• $f(x,y)=(1/9)(\mathbf {1} \{(x,y)=(0,0)\}+2\cdot \mathbf {1} \{(x,y)=(0,1)\}+2\cdot \mathbf {1} \{(x,y)=(1,0)\}+4\cdot \mathbf {1} \{(x,y)=(1,1)\})$
• $f(x,y)=(1/9)(4\cdot \mathbf {1} \{(x,y)=(0,0)\}+2\cdot \mathbf {1} \{(x,y)=(0,1)\}+2\cdot \mathbf {1} \{(x,y)=(1,0)\}+\mathbf {1} \{(x,y)=(1,1)\})$
• $f(x,y)=(2/9)(\mathbf {1} \{(x,y)=(0,0)\}+\mathbf {1} \{(x,y)=(0,1)\}+\mathbf {1} \{(x,y)=(1,0)\}+\mathbf {1} \{(x,y)=(1,1)\})$

Exercise. Recall the example in the motivation section.

(a) Suppose we toss a fair coin twice. Let $X=\mathbf {1} \{{\text{head comes up}}\}$  and $Y=2\cdot \mathbf {1} \{{\text{head comes up}}\}$ . Show that joint pmf of $(X,Y)$  is

$f(x,y)={\frac {\mathbf {1} \{x\in \{0,1\}\cap y\in \{0,2\}\}}{4}}.$

(b) Suppose we toss a fair coin once. Let $X=\mathbf {1} \{{\text{head comes up}}\}$  and $Y=2X$ . Show that joint pmf of $(X,Y)$  is

$f(x,y)={\frac {\mathbf {1} \{x\in \{0,1\}\cap y=2x\}}{2}}.$

(c) Show that marginal pmf of $X$  and $Y$  are

$f_{X}(x)={\frac {\mathbf {1} \{x\in \{0,1\}\}}{2}}\quad {\text{and}}\quad f_{Y}(y)={\frac {\mathbf {1} \{y\in \{0,2\}\}}{2}}$

in each of the situations in (a) and (b). (Hint: for part (b), substitute the possible values into the variable inside the indicator.)

Proof.

(a) Since the support of $(X,Y)$  is $\{(0,0),(0,2),(1,0),(1,2)\}$ , and each of these four outcomes has probability $(1/2)(1/2)=1/4$  (the two tosses are independent), the joint pmf of $(X,Y)$  is

$f(x,y)=\mathbb {P} (X=x\cap Y=y)={\frac {\mathbf {1} \{x\in \{0,1\}\cap y\in \{0,2\}\}}{4}}.$

(b) Since the support of $(X,Y)$  is $\{(0,0),(1,2)\}$  (i.e. $x\in \{0,1\}$  and $y=2x$ ), and each of these two outcomes has probability $1/2$ , the joint pmf of $(X,Y)$  is

$f(x,y)=\mathbb {P} (X=x\cap Y=y)={\frac {\mathbf {1} \{x\in \{0,1\}\cap y=2x\}}{2}}.$

(c) Part (a): marginal pmf of $X$  is

$f_{X}(x)=f(x,0)+f(x,2)={\frac {2\mathbf {1} \{x\in \{0,1\}\}}{4}}={\frac {\mathbf {1} \{x\in \{0,1\}\}}{2}},$

and marginal pmf of $Y$  is
$f_{Y}(y)=f(0,y)+f(1,y)={\frac {2\mathbf {1} \{y\in \{0,2\}\}}{4}}={\frac {\mathbf {1} \{y\in \{0,2\}\}}{2}}.$

Part (b): The marginal pmf of $X$  is

$f_{X}(x)=f(x,0)+f(x,2)={\frac {\mathbf {1} \{\overbrace {x\in \{0,1\}\cap 0=2x} ^{x=0}\}}{2}}+{\frac {\mathbf {1} \{\overbrace {x\in \{0,1\}\cap 2=2x} ^{x=1}\}}{2}}={\frac {\overbrace {\mathbf {1} \{x=0\}+\mathbf {1} \{x=1\}} ^{\mathbf {1} \{x=0\cup x=1\}}}{2}}={\frac {\mathbf {1} \{x\in \{0,1\}\}}{2}}.$

Similarly, the marginal pmf of $Y$  is
$f_{Y}(y)=f(0,y)+f(1,y)={\frac {\mathbf {1} \{0\in \{0,1\}\cap y=0\}}{2}}+{\frac {\mathbf {1} \{1\in \{0,1\}\cap y=2\}}{2}}={\frac {\overbrace {\mathbf {1} \{0\in \{0,1\}\}} ^{1}\mathbf {1} \{y=0\}+\overbrace {\mathbf {1} \{1\in \{0,1\}\}} ^{1}\mathbf {1} \{y=2\}}{2}}={\frac {\mathbf {1} \{y\in \{0,2\}\}}{2}}.$

For jointly continuous random variables, the definition is a generalized version of the one for a single continuous random variable (the univariate case).

Definition. (Jointly continuous random variable) Random variables $X_{1},\dotsc ,X_{n}$  are jointly continuous if

$\mathbb {P} {\big (}(X_{1},\dotsc ,X_{n})\in S{\big )}=\int \dotsi \int _{S}f(x_{1},\dotsc ,x_{n})\,dx_{1}\cdots \,dx_{n},\quad S\subseteq \mathbb {R} ^{n},$

for some nonnegative function $f$ .

Remark.

• The function $f$  is the joint probability density function (joint pdf) of $X_{1},\dotsc ,X_{n}$ .
• Similarly, $f(x_{1},\dotsc ,x_{n})\,dx_{1}\cdots \,dx_{n}$  can be interpreted as the probability over the 'infinitesimal' region $[x_{1},x_{1}+dx_{1}]\times \dotsb \times [x_{n},x_{n}+dx_{n}]$ , and $f(x_{1},\dotsc ,x_{n})$  can be interpreted as the density of the probability over that 'infinitesimal' region, i.e. ${\frac {\mathbb {P} {\big (}X\in [x_{1},x_{1}+dx_{1}]\times \dotsb \times [x_{n},x_{n}+dx_{n}]{\big )}}{dx_{1}\dotsb dx_{n}}}$ , intuitively and non-rigorously.
• By setting $S=(-\infty ,x_{1}]\times \dotsb \times (-\infty ,x_{n}]$ , the cdf is

$F(x_{1},\dotsc ,x_{n})=\underbrace {\int _{-\infty }^{x_{1}}\cdots \int _{-\infty }^{x_{n}}} _{n\;{\text{integrations}}}f(u_{1},\dotsc ,u_{n})\,du_{n}\cdots \,du_{1},$

which is similar to the univariate case.

Definition. (Marginal probability density function) The pdf $f_{X_{i}}$  of each $X_{i}$  which is a member of the $n$  random variables $X_{1},X_{2},\dotsc ,X_{n}$  is the marginal probability density function (marginal pdf) of $X_{i}$ .

Proposition. (Obtaining marginal pdf from joint pdf) For continuous random variables $X_{1},\dotsc ,X_{n}$  with joint pdf $f$ , the marginal pdf of $X_{i}$  is

$f_{X_{i}}({\color {red}x})=\underbrace {\int _{-\infty }^{\infty }\cdots \int _{-\infty }^{\infty }} _{n-1\;{\text{integrations}}}f(u_{1},\dotsc ,u_{i-1},{\color {red}x},u_{i+1},\dotsc ,u_{n})\,du_{1}\cdots \,du_{i-1}\,du_{i+1}\cdots \,du_{n}.$

Proof. Recall the proposition about obtaining marginal cdf from joint cdf. We have

{\begin{aligned}&&F_{X_{i}}({\color {red}x})&=F(\infty ,\dotsc ,\infty ,\overbrace {\color {red}x} ^{i{\text{-th position}}},\infty ,\dotsc ,\infty )\\&\Rightarrow &\int _{-\infty }^{\color {red}x}f_{X_{i}}(u)\,du&=\int _{-\infty }^{\infty }\cdots \int _{-\infty }^{\color {red}x}\cdots \int _{-\infty }^{\infty }f(u_{1},\dotsc ,u_{n})\,du_{n}\cdots \,du_{i}\cdots \,du_{1}\qquad {\text{by definitions}}\\&\Rightarrow &{\frac {d}{dx}}\int _{-\infty }^{\color {red}x}f_{X_{i}}(u)\,du&={\frac {d}{dx}}\int _{-\infty }^{\infty }\cdots \int _{-\infty }^{\color {red}x}\cdots \int _{-\infty }^{\infty }f(u_{1},\dotsc ,u_{n})\,du_{n}\cdots \,du_{i}\cdots \,du_{1}\\&\Rightarrow &f_{X_{i}}({\color {red}x})&=\underbrace {\int _{-\infty }^{\infty }\cdots \int _{-\infty }^{\infty }} _{n-1\;{\text{integrations}}}f(u_{1},\dotsc ,u_{i-1},{\color {red}x},u_{i+1},\dotsc ,u_{n})\,du_{1}\cdots \,du_{i-1}\,du_{i+1}\cdots \,du_{n}\qquad {\text{by fundamental theorem of calculus}}\end{aligned}}

$\Box$

Proposition. (Obtaining joint pdf from joint cdf) If the joint cdf $F$  of jointly continuous random variables has the $n$ -th mixed partial derivative at $(x_{1},\dotsc ,x_{n})$ , then the joint pdf is

$f(x_{1},\dotsc ,x_{n})={\frac {\partial ^{n}}{\partial x_{1}\cdots \partial x_{n}}}F(x_{1},\cdots ,x_{n}).$

Proof. It follows from using fundamental theorem of calculus $n$  times.

$\Box$

Example. If the joint pdf of the jointly continuous random variables $(X,Y)$  is

$f(x,y)=4xy(\mathbf {1} \{x,y\in [0,1]\}),$

the marginal pdf of $X$  is
$f_{X}(x)=\int _{-\infty }^{\infty }4xy(\mathbf {1} \{x\in [0,1]\}\mathbf {1} \{y\in [0,1]\})\,dy=4x(\mathbf {1} \{x\in [0,1]\})\underbrace {\int _{0}^{1}y\,dy} _{1^{2}/2-0^{2}/2}=2x(\mathbf {1} \{x\in [0,1]\}).$

Also,
$\mathbb {P} ((X,Y)\leq (1/2,1/2))=\int _{-\infty }^{1/2}\int _{-\infty }^{1/2}4xy(\mathbf {1} \{x,y\in [0,1]\})\,dx\,dy=4\int _{0}^{1/2}y\int _{0}^{1/2}x\,dx\,dy=4\cdot {\frac {(1/2)^{2}}{2}}\int _{0}^{1/2}y\,dy=4\cdot {\frac {1}{8}}\cdot {\frac {1}{8}}={\frac {1}{16}}.$

(The factor $4$  normalizes $f$ : a joint pdf must integrate to one over its support.)
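Marginalization of a joint pdf can also be checked numerically with a midpoint-rule Riemann sum. The sketch below uses the product-form density $f(x,y)=4xy$  on $[0,1]^{2}$  (the constant $4$  makes it integrate to one); the grid size is an arbitrary choice:

```python
# Midpoint-rule check of marginalization for f(x, y) = 4xy on [0, 1]^2.
n = 2000
h = 1.0 / n
mids = [(i + 0.5) * h for i in range(n)]  # midpoints of [0, 1]

def f(x, y):
    return 4.0 * x * y

# Marginal pdf of X at x = 0.5: should be 2x = 1.0.
fX_half = sum(f(0.5, y) for y in mids) * h
print(round(fX_half, 4))  # 1.0

# P(X <= 1/2 and Y <= 1/2): should be 1/16 = 0.0625.
p = sum(f(x, y) * h * h for x in mids if x <= 0.5 for y in mids if y <= 0.5)
print(round(p, 4))  # 0.0625
```

The midpoint rule is exact on polynomials of degree one in each variable, so the agreement here is essentially to machine precision.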

Exercise. Let $X$  and $Y$  be jointly continuous random variables. Consider the joint cdf of $(X,Y)$ :

$F(x,y)=\mathbf {1} \{x,y\in [0,2]\}{\frac {x^{2}y^{3}}{32}}.$

1 Calculate the joint pdf of $(X,Y)$ .

• $f(x,y)=\mathbf {1} \{x,y\in [0,2]\}{\frac {3xy^{2}}{8}}$
• $f(x,y)=\mathbf {1} \{x,y\in [0,2]\}{\frac {3xy^{2}}{16}}$
• $f(x,y)=\mathbf {1} \{x,y\in [0,2]\}{\frac {xy^{2}}{16}}$
• $f(x,y)=\mathbf {1} \{x,y\in [0,2]\}{\frac {xy^{2}}{8}}$

2 Calculate the marginal pdf of $X$ .

• $\mathbf {1} \{0\leq y\leq 2\}{\frac {3}{8}}y^{2}$
• $\mathbf {1} \{0\leq x\leq 2\}{\frac {1}{2}}x$
• $\mathbf {1} \{0\leq y\leq 2\}{\frac {1}{12}}y^{3}$
• $\mathbf {1} \{0\leq x\leq 2\}{\frac {1}{8}}x^{2}$

## Independence

Recall that multiple events are independent if the probability for the intersection of them equals the product of probabilities of each event, by definition. Since $\{X\in A\}$  is also an event, we have a natural definition of independence for random variables as follows:

Definition. (Independence of random variables) Random variables $X_{1},X_{2},\dotsc ,X_{n}$  are independent if

$\mathbb {P} (X_{1}\in A_{1}\cap \cdots \cap X_{n}\in A_{n})=\mathbb {P} (X_{1}\in A_{1})\cdots \mathbb {P} (X_{n}\in A_{n})$

for all subsets $A_{1},A_{2},\dotsc ,A_{n}\subseteq \mathbb {R}$ .

Remark. Under this condition, the events $\{X_{1}\in A_{1}\},\dotsc ,\{X_{n}\in A_{n}\}$  are independent.

Theorem. (Alternative condition for independence of random variables) Random variables $X_{1},X_{2},\dotsc ,X_{n}$  are independent if and only if the joint cdf of $(X_{1},\dotsc ,X_{n})$  satisfies

$F(x_{1},\dotsc ,x_{n})=F_{X_{1}}(x_{1})\cdots F_{X_{n}}(x_{n})$

or, equivalently, the joint pdf or pmf of $(X_{1},\dotsc ,X_{n})$  satisfies
$f(x_{1},\dotsc ,x_{n})=f_{X_{1}}(x_{1})\cdots f_{X_{n}}(x_{n})$

for each $x_{1},\dotsc ,x_{n}\in \mathbb {R}$ .

Proof. Partial:

Only if part: If random variables $X_{1},X_{2},\dotsc ,X_{n}$  are independent,

$\mathbb {P} (X_{1}\in A_{1}\cap \cdots \cap X_{n}\in A_{n})=\mathbb {P} (X_{1}\in A_{1})\cdots \mathbb {P} (X_{n}\in A_{n})$

for all subsets $A_{1},A_{2},\dotsc ,A_{n}\subseteq \mathbb {R}$ . Setting $A_{1}=(-\infty ,x_{1}],\dotsc ,A_{n}=(-\infty ,x_{n}]$ , we have
$\mathbb {P} (X_{1}\leq x_{1}\cap \cdots \cap X_{n}\leq x_{n})=\mathbb {P} (X_{1}\leq x_{1})\cdots \mathbb {P} (X_{n}\leq x_{n})\implies F(x_{1},\dotsc ,x_{n})=F_{X_{1}}(x_{1})\cdots F_{X_{n}}(x_{n}).$

Thus, we obtain the result for the joint cdf part.

For the joint pdf part,

{\begin{aligned}&&F(x_{1},\dotsc ,x_{n})&=F_{X_{1}}(x_{1})\cdots F_{X_{n}}(x_{n})\\&\Rightarrow &{\frac {\partial ^{n}}{\partial x_{1}\cdots \partial x_{n}}}F(x_{1},\dotsc ,x_{n})&={\frac {\partial ^{n}}{\partial x_{1}\cdots \partial x_{n}}}\left(F_{X_{1}}(x_{1})\cdots F_{X_{n}}(x_{n})\right)\\&\Rightarrow &f(x_{1},\dotsc ,x_{n})&=f_{X_{n}}(x_{n}){\frac {\partial ^{n-1}}{\partial x_{1}\cdots \partial x_{n-1}}}\left(F_{X_{1}}(x_{1})\cdots F_{X_{n-1}}(x_{n-1})\right)\\&&&=f_{X_{n}}(x_{n})f_{X_{n-1}}(x_{n-1}){\frac {\partial ^{n-2}}{\partial x_{1}\cdots \partial x_{n-2}}}\left(F_{X_{1}}(x_{1})\cdots F_{X_{n-2}}(x_{n-2})\right)\\&&&=\cdots =f_{X_{1}}(x_{1})\cdots f_{X_{n}}(x_{n}).\end{aligned}}

$\Box$

Remark.

• That is, the random variables are independent if and only if the joint cdf (or the joint pdf or pmf) can be factorized as the product of the marginal cdf's (or the marginal pdf's or pmf's).
• Actually, if we can factorize the joint cdf, joint pdf, or joint pmf as a product of functions of each individual variable, then the condition is also satisfied.

Example. The joint pdf of two independent exponential random variables $X$  and $Y$  with rate $\lambda$  is

$f(x,y)=(\mathbf {1} \{x\geq 0\}\lambda e^{-\lambda x})(\mathbf {1} \{y\geq 0\}\lambda e^{-\lambda y})=\mathbf {1} \{x,y\geq 0\}\lambda ^{2}e^{-\lambda (x+y)}.$

(Random variables $X$  and $Y$  are said to be independent and identically distributed (i.i.d.) in this case)

In general, the joint pdf of $n$  independent exponential random variables with rate $\lambda$ , $X_{1},\dotsc ,X_{n}$  is

$f(x_{1},\dotsc ,x_{n})=\mathbf {1} \{x_{1},\dotsc ,x_{n}\geq 0\}\lambda ^{n}e^{-\lambda (x_{1}+\cdots +x_{n})}.$

(Random variables $X_{1},\dotsc ,X_{n}$  are also i.i.d. in this case)

On the other hand, if the joint pdf of two random variables $V$  and $W$  is

$f(v,w)=\mathbf {1} \{v,w\geq 0\}\mathbf {1} \{w\leq 2-2v\},$

then random variables $V$  and $W$  are dependent, since the joint pdf cannot be factorized as the product of marginal pdf's (the support is a triangle, not a product set).
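Independence, or its failure, can also be probed empirically: for independent samples the empirical joint probability $\mathbb {P} (X\leq a\cap Y\leq b)$  should be close to the product of the empirical marginals. A minimal simulation sketch (the rate, thresholds, sample size, and seed are arbitrary choices):

```python
import random

random.seed(0)
lam, n = 1.0, 200_000
xs = [random.expovariate(lam) for _ in range(n)]  # X_i ~ Exp(1)
ys = [random.expovariate(lam) for _ in range(n)]  # Y_i ~ Exp(1), independent of X_i

a, b = 1.0, 0.5
p_joint = sum(1 for x, y in zip(xs, ys) if x <= a and y <= b) / n
p_x = sum(1 for x in xs if x <= a) / n
p_y = sum(1 for y in ys if y <= b) / n

# For independent variables the joint cdf factorizes, so the gap is small.
print(abs(p_joint - p_x * p_y) < 0.01)  # True
```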

Exercise. Let $X,Y,Z$  be jointly continuous random variables. Consider a joint pdf of $(X,Y,Z)$ :

$f(x,y,z)=\mathbf {1} \{x,y,z\geq 0\}\mathbf {1} \{x+y+z/k\leq 1\}.$

1 Calculate $k$ .

 1 2 3 4

2 Are $X,Y,Z$  independent?

 yes no

Consider another joint pdf of $(X,Y,Z)$ :

$f(x,y,z)=\mathbf {1} \{x,y,z\geq 0\}\mathbf {1} \{y\leq 1-x\}\mathbf {1} \{z\leq k\}$

1 Calculate $k$ .

 1 2 3 4

2 Are $X,Y,Z$  independent?

 yes no

Consider another joint pdf of $(X,Y,Z)$ :

$f(x,y,z)=kxyz\mathbf {1} \{x,y\in [0,1]\}\mathbf {1} \{z\in [0,2]\}.$

1 Calculate $k$ .

 1 2 3 4

2 Are $X,Y,Z$  independent?

 yes no

Proposition. (Independence of events concerning disjoint sets of independent random variables) Suppose random variables $X_{1},X_{2},\dotsc$  are independent. Then, for each $r<s<t<\cdots$  and fixed functions $f_{1},f_{2},f_{3},\dotsc$ , the random variables

$Y_{1}=f_{1}(X_{1},\dotsc ,X_{\color {red}r}),\quad Y_{2}=f_{2}(X_{{\color {red}r}+1},\dotsc ,X_{\color {blue}s}),\quad Y_{3}=f_{3}(X_{{\color {blue}s}+1},\dotsc ,X_{t}),\dotsc$

are independent.

Example. Suppose $X_{1},X_{2},X_{3},X_{4}$  are independent Bernoulli random variables with success probability $p$ . Then, $Y_{1}=X_{1}+X_{2}$  and $Y_{2}=X_{3}-X_{4}$  are also independent.

On the other hand, $Y_{1}=X_{1}+X_{2}$  and $Y_{2}=2-X_{3}-X_{2}$  are not independent. A counter-example to the independence is

$\underbrace {\mathbb {P} (Y_{1}=2\cap Y_{2}=2)} _{0}\neq \underbrace {\mathbb {P} (Y_{1}=2)\mathbb {P} (Y_{2}=2)} _{{\text{may}}\;\neq 0}.$

Left hand side equals zero since $Y_{1}=2\implies X_{2}=1$ , but $Y_{2}=2\implies X_{2}=0$ .

Right hand side may not equal zero since $\mathbb {P} (Y_{1}=2)=\mathbb {P} (X_{1}=1\cap X_{2}=1)=p^{2}$ , and $\mathbb {P} (Y_{2}=2)=\mathbb {P} (X_{2}=0\cap X_{3}=0)=(1-p)^{2}$ . We can see that $p^{2}(1-p)^{2}$  is nonzero whenever $0<p<1$ .
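The counterexample can be confirmed by simulation. This sketch (with an arbitrary $p$ , sample size, and seed) checks that the event $\{Y_{1}=2\cap Y_{2}=2\}$  never occurs while each individual event still occurs:

```python
import random

random.seed(1)
p, trials = 0.5, 100_000
joint = y1_hits = y2_hits = 0
for _ in range(trials):
    x1, x2, x3 = (int(random.random() < p) for _ in range(3))  # Bernoulli(p)
    y1, y2 = x1 + x2, 2 - x3 - x2  # Y1 and Y2 share X2
    joint += (y1 == 2 and y2 == 2)
    y1_hits += (y1 == 2)
    y2_hits += (y2 == 2)

print(joint)  # 0: Y1 = 2 forces X2 = 1, but Y2 = 2 forces X2 = 0
print(y1_hits > 0 and y2_hits > 0)  # True
```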

Exercise.

Let $X_{1},\dotsc ,X_{n}$  be i.i.d. random variables, and $Y_{1},\dotsc ,Y_{n}$  also be i.i.d. random variables. Which of the following is (are) true?

• $\sum _{i=1}^{n-1}X_{i}$  and $X_{n}$  are independent.
• $X_{1}^{X_{2}}$  and $X_{3}^{X_{4}}$  are independent.
• $\prod _{i=1}^{n}X_{i}$  and $\prod _{i=1}^{n}Y_{i}$  are independent.
• $X_{1}+X_{2}+X_{3}$  and $Y_{1}+Y_{2}+Y_{3}$  are independent if $X_{1},\dotsc ,X_{n},Y_{1},\dotsc ,Y_{n}$  are independent.

### Sum of independent random variables (optional)

In general, we determine the distribution of a sum of independent random variables from the joint cdf, pdf, or pmf by first principles. In particular, there are some interesting results related to the distribution of such sums.


Proposition. (Convolution of cdf's and pdf's) If independent continuous random variables $X$  and $Y$  have cdf's $F_{X}$  and $F_{Y}$  and pdf's $f_{X}$  and $f_{Y}$  respectively, then the cdf of $X+Y$  is

${\color {red}F}_{X+Y}(z)=\int _{-\infty }^{\infty }{\color {red}F}_{X}(z-y)f_{Y}(y)\,dy,$

and the pdf of $X+Y$  is
${\color {blue}f}_{X+Y}(z)=\int _{-\infty }^{\infty }{\color {blue}f}_{X}(z-y)f_{Y}(y)\,dy.$

Proof.

• Continuous case:
• cdf:
{\begin{aligned}F_{X+Y}(z)&=\mathbb {P} (X+Y\leq z)&{\text{by definition}}\\&=\iint _{x+y\leq z}f_{X}(x)f_{Y}(y)\,dx\,dy&{\text{by definition and independence}}\\&=\int _{-\infty }^{\infty }\int _{-\infty }^{z-y}f_{X}(x)f_{Y}(y)\,dx\,dy&{\text{by Fubini's theorem}}\\&=\int _{-\infty }^{\infty }\left(\int _{-\infty }^{z-y}f_{X}(x)\,dx\right)f_{Y}(y)\,dy\\&=\int _{-\infty }^{\infty }F_{X}(z-y)f_{Y}(y)\,dy&{\text{by definition}}.\end{aligned}}

(Figure: the integration region $x+y\leq z$  is the half-plane below the line $x=z-y$ ; for each fixed $y\in \mathbb {R}$ , $x$  ranges over $(-\infty ,z-y]$ , which gives the limits of the inner integral.)

• pdf:
{\begin{aligned}f_{X+Y}(z)&={\frac {d}{dz}}\int _{-\infty }^{\infty }F_{X}(z-y)f_{Y}(y)\,dy\\&=\int _{-\infty }^{\infty }{\frac {d}{dz}}F_{X}(z-y)f_{Y}(y)\,dy&{\text{by interchanging differentiation and integration}}\\&=\int _{-\infty }^{\infty }f_{X}(z-y)f_{Y}(y)\,dy&{\text{by fundamental theorem of calculus}}.\end{aligned}}

$\Box$

Remark.

• The cdf and pdf in this case are actually the convolution of the cdf's $F_{X}$  and $F_{Y}$ , and pdf's (pmf's) $f_{X}$  and $f_{Y}$  respectively, and hence the name of the proposition.

Example.

• Let the pdf of $X$  be $f_{X}(x)=\mathbf {1} \{0\leq {\color {blue}x}\leq 1\}$ .
• Let the pdf of $Y$  be $f_{Y}(y)=\mathbf {1} \{-1\leq y\leq 0\}$ .
• Then, the pdf of $X+Y$  is

{\begin{aligned}f_{X+Y}(z)&=\int _{-\infty }^{\infty }\mathbf {1} \{0\leq {\color {blue}z-y}\leq 1\}\mathbf {1} \{-1\leq y\leq 0\}\,dy\\&=\int _{-\infty }^{\infty }\mathbf {1} \{z-1\leq y\leq z\}\mathbf {1} \{-1\leq y\leq 0\}\,dy\\&=\mathbf {1} \{0\leq z\leq 1\}\int _{z-1}^{0}\,dy+\mathbf {1} \{-1\leq z\leq 0\}\int _{-1}^{z}\,dy\\&=\mathbf {1} \{0\leq z\leq 1\}(1-z)+(z+1)\mathbf {1} \{-1\leq z\leq 0\}.\end{aligned}}

Graphically, the pdf is a triangle with peak $1$  at $z=0$ : it rises along $f_{X+Y}(z)=z+1$  on $[-1,0]$  and falls along $f_{X+Y}(z)=1-z$  on $[0,1]$ .

Exercise.

1 Calculate $\mathbb {P} (X=2-Y)$ .

 0 1/4 1/2 3/4 1

2 Calculate $\mathbb {P} (X<2-Y)$ .

 0 1/4 1/2 3/4 1

3 Calculate $k$  such that $\mathbb {P} (X .

 -1/2 -1/4 0 1/4 1/2

Proposition. (Convolution of pmf's) If the pmf's of independent random variables $X$  and $Y$ , each taking nonnegative integer values, are $f_{X}$  and $f_{Y}$  respectively, then the pmf of $X+Y$  is

$f_{X+Y}(n)=\sum _{k=0}^{n}f_{X}(k)f_{Y}(n-k)=f_{X}(0)f_{Y}(n)+f_{X}(1)f_{Y}(n-1)+\dotsb +f_{X}(n-1)f_{Y}(1)+f_{X}(n)f_{Y}(0).$

Proof.

• Let $E_{i}=\{X=i\}\cap \{Y=n-i\}$ .
• For each nonnegative integer $n$ ,

$\{X+Y=n\}=E_{0}\cup E_{1}\cup \dotsb \cup E_{n}.$

• The events $E_{i}$  are pairwise disjoint, since $X$  cannot take two different values simultaneously (i.e. $E_{i}\cap E_{j}=\varnothing$  for each $i\neq j$ ).
• Hence, by extended P3 and independence of $X$  and $Y$ ,
$\mathbb {P} (X+Y=n)=\mathbb {P} (X=0)\mathbb {P} (Y=n)+\mathbb {P} (X=1)\mathbb {P} (Y=n-1)+\dotsb +\mathbb {P} (X=n)\mathbb {P} (Y=0).$

• The result follows by definition.

$\Box$

Example. We roll a fair six-faced die twice (independently). Then, the probability for the sum of the numbers coming up to be 7 is $\underbrace {(1/6)(1/6)+\dotsb +(1/6)(1/6)} _{6{\text{ times}}}=1/6$ .

Proof. Let $X$  and $Y$  be the first and second number coming up respectively. The desired probability is

$\mathbb {P} (X+Y=7)=\underbrace {\mathbb {P} (X=1)} _{1/6}\underbrace {\mathbb {P} (Y=6)} _{1/6}+\dotsb +\underbrace {\mathbb {P} (X=6)} _{1/6}\underbrace {\mathbb {P} (Y=1)} _{1/6}=\underbrace {(1/6)(1/6)+\dotsb +(1/6)(1/6)} _{6{\text{ times}}}=1/6.$

$\Box$
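The same convolution can be carried out mechanically. This sketch (the representation is an arbitrary choice) convolves two fair-die pmf's with exact fractions and reads off a few probabilities of the sum:

```python
from fractions import Fraction

# pmf of one fair six-faced die.
die = {k: Fraction(1, 6) for k in range(1, 7)}

def convolve(f, g):
    # pmf of the sum of two independent discrete random variables.
    out = {}
    for a, pa in f.items():
        for b, pb in g.items():
            out[a + b] = out.get(a + b, Fraction(0)) + pa * pb
    return out

total = convolve(die, die)
print(total[7])  # 1/6
print(total[2])  # 1/36
```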

Exercise.

1 Calculate the probability for the sum to be 6 instead.

 1/12 1/6 5/36 7/36 4/9

2 The probability for the sum to be $k$  is 0. Which of the following is (are) possible value(s) of $k$ ?

 1 2 3 12 13

3 Suppose the die is loaded such that the probability for the number coming up to be 6 is now $1/2$ , and the other numbers are equally likely to come up. Calculate the probability for the sum to be 7 now.

 0.1 0.101 0.1001 0.1 0.167

Proposition. (Sum of independent Poisson r.v.'s) If $X_{1}\sim \operatorname {Pois} (\lambda _{1}),\dotsc ,X_{n}\sim \operatorname {Pois} (\lambda _{n})$  and $X_{1},\dotsc ,X_{n}$  are independent, then $X_{1}+\dotsb +X_{n}\sim \operatorname {Pois} (\lambda _{1}+\dotsb +\lambda _{n})$ .

Proof.

• The pmf of $X_{1}+X_{2}$  is

{\begin{aligned}f_{X_{1}+X_{2}}(n)&=\sum _{k=0}^{n}{\frac {e^{-\lambda _{1}}\lambda _{1}^{k}}{k!}}\cdot {\frac {e^{-\lambda _{2}}\lambda _{2}^{n-k}}{(n-k)!}}&{\text{by the proposition about convolution of pmf's}}\\&=e^{-\lambda _{1}-\lambda _{2}}\sum _{k=0}^{n}{\frac {\lambda _{1}^{k}\cdot \lambda _{2}^{n-k}}{k!(n-k)!}}\\&={\frac {e^{-(\lambda _{1}+\lambda _{2})}}{n!}}\underbrace {\sum _{k=0}^{n}{\frac {n!}{k!(n-k)!}}\cdot \lambda _{1}^{k}\cdot \lambda _{2}^{n-k}} _{=(\lambda _{1}+\lambda _{2})^{n}}&{\text{ by binomial theorem}}.\\\end{aligned}}

• This pmf is the pmf of $\operatorname {Pois} (\lambda _{1}+\lambda _{2})$ , and so $X_{1}+X_{2}\sim \operatorname {Pois} (\lambda _{1}+\lambda _{2})$ .
• We can extend this result to $n$  Poisson r.v.'s by induction.

$\Box$
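A quick numerical check of this proposition (the parameter values are arbitrary choices): convolving the $\operatorname {Pois} (3)$  and $\operatorname {Pois} (4)$  pmf's at a point should reproduce the $\operatorname {Pois} (7)$  pmf there.

```python
import math

def pois_pmf(lam, k):
    # pmf of Pois(lam): e^(-lam) * lam^k / k!
    return math.exp(-lam) * lam**k / math.factorial(k)

lam1, lam2, n = 3.0, 4.0, 5
conv = sum(pois_pmf(lam1, k) * pois_pmf(lam2, n - k) for k in range(n + 1))

print(abs(conv - pois_pmf(lam1 + lam2, n)) < 1e-12)  # True
```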

Example. There are two service counters, for which the first one receives $X\sim \operatorname {Pois} (3)$  enquiries per hour, while the second one receives $Y\sim \operatorname {Pois} (4)$  enquiries per hour. Given that $X$  and $Y$  are independent, the number of enquiries received by the two counters per hour follows $\operatorname {Pois} (3+4)=\operatorname {Pois} (7)$ .

Proof.

• The number of enquiries received by the two counters per hour is $X+Y$ .
• Then, the result follows from the proposition about sum of Poisson r.v.'s.

$\Box$

Exercise.

Which distribution does the number of enquiries received by the first counter over two hours follow?

• $\operatorname {Pois} (3)$
• $\operatorname {Pois} (4)$
• $\operatorname {Pois} (6)$
• $\operatorname {Pois} (7)$
• $\operatorname {Pois} (8)$

### Order statistics

Definition. (Order statistics) Let $X_{1},\dotsc ,X_{n}$  be $n$  i.i.d. r.v.'s, each with cdf $F(x)$ . Define $X_{(1)},X_{(2)},\dotsc ,X_{(n)}$  to be the smallest, second smallest, ..., largest of $X_{1},X_{2},\dotsc ,X_{n}$ . Then, the ordered values $X_{(1)}\leq X_{(2)}\leq \dotsb \leq X_{(n)}$  are the order statistics.

Proposition. (Cdf of order statistics) The cdf of $X_{(k)}$  ($k$  is an integer such that $1\leq k\leq n$ ) is

$F_{X_{(k)}}({\color {blue}x})=\sum _{j=k}^{n}{\binom {n}{j}}(F({\color {blue}x}))^{j}{\big (}1-F({\color {blue}x}){\big )}^{n-j}.$

Proof.

• Consider the event $\{X_{(k)}\leq {\color {blue}x}\}$ .
• Observe that $\{X_{(k)}\leq {\color {blue}x}\}=\{{\text{at least }}k{\text{ of the }}X_{i}{\text{'s are }}\leq {\color {blue}x}\}$ : the $k$ -th smallest value is at most ${\color {blue}x}$  exactly when at least $k$  of the values are at most ${\color {blue}x}$ .
• Let no. of $X_{i}$ 's that are less than or equal to ${\color {blue}x}$  be $N$ .
• Since $N\sim \operatorname {Binom} (n,\mathbb {P} (X_{i}\leq {\color {blue}x})){\overset {\text{ def }}{=}}\operatorname {Binom} (n,F({\color {blue}x}))$  (because for each $X_{i}$ , we can treat $X_{i}\leq x$  and $X_{i}>x$  as the two outcomes of a Bernoulli trial),
• The cdf is

$\mathbb {P} (X_{(k)}\leq {\color {blue}x})=\mathbb {P} (N\geq k)=\sum _{j=k}^{n}{\binom {n}{j}}(F({\color {blue}x}))^{j}{\big (}1-F({\color {blue}x}){\big )}^{n-j}.$

$\Box$

Example. Let $X_{1},X_{2},X_{3}$  be i.i.d. r.v.'s following $\operatorname {Exp} (2)$ . Then, the cdf of $X_{(2)}$  is

$\sum _{j=2}^{3}{\binom {3}{j}}(F(x))^{j}(1-F(x))^{3-j}=\mathbf {1} \{x\geq 0\}\left({\binom {3}{2}}(1-e^{-2x})^{2}(e^{-2x})+{\binom {3}{3}}(1-e^{-2x})^{3}\right)=\mathbf {1} \{x\geq 0\}\left(3(1-e^{-2x})^{2}e^{-2x}+(1-e^{-2x})^{3}\right).$
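The order-statistic cdf formula can be checked against simulation. This sketch (with an arbitrary evaluation point, sample size, and seed) compares the formula for $F_{X_{(2)}}$  of three i.i.d. $\operatorname {Exp} (2)$  variables with a Monte Carlo estimate from sorted samples:

```python
import math
import random

random.seed(2)
n, k, rate, x = 3, 2, 2.0, 0.5
F = 1.0 - math.exp(-rate * x)  # cdf of Exp(2) at x

# F_{X_(k)}(x) = sum_{j=k}^{n} C(n, j) F(x)^j (1 - F(x))^(n - j).
formula = sum(math.comb(n, j) * F**j * (1.0 - F) ** (n - j) for j in range(k, n + 1))

trials = 100_000
hits = sum(
    1 for _ in range(trials)
    if sorted(random.expovariate(rate) for _ in range(n))[k - 1] <= x
)
print(abs(hits / trials - formula) < 0.01)  # True
```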

Exercise.

Calculate $\mathbb {P} (X_{(2)}\geq 2)$ .

 0.000665 0.000994 0.036296 0.963704 0.999335

## Poisson process

(Figure: illustration of a Poisson process. Each circle indicates one arrival; the arrivals occur at a common rate $\lambda$ , and the successive interarrival times are independent.)

Definition. (Poisson process) If the successive interarrival times of unpredictable events are independent random variables, each following an exponential distribution with a common rate $\lambda$ , then the process of arrivals is a Poisson process with rate $\lambda$ .

There are several important properties of the Poisson process.

Proposition. (Time to $n$ -th event in Poisson process) The time to the $n$ -th event in a Poisson process with rate $\lambda$  follows the $\operatorname {Gamma} (n,\lambda )$  distribution.

Proof.

• The time to the $n$ -th event is $X_{1}+\dotsb +X_{n}$ , where each $X_{i}$  follows $\operatorname {Exp} (\lambda )$ .
• It suffices to prove that $X_{1}+X_{2}\sim \operatorname {Gamma} (2,\lambda )$ , and then the desired result follows by induction.
• {\begin{aligned}f_{X_{1}+X_{2}}(z)&=\lambda ^{2}\int _{-\infty }^{\infty }\mathbf {1} \{\underbrace {z-x\geq 0} _{x\leq z}\}\mathbf {1} \{x\geq 0\}e^{-\lambda (z-x)}e^{-\lambda x}\,dx&{\text{by proposition about convolution of pdf's}}\\&=\lambda ^{2}\int _{0}^{z}e^{-\lambda (z{\cancel {-x}}){\cancel {-\lambda x}}}\,dx\\&=\lambda ^{2}\int _{0}^{z}e^{-\lambda z}\,dx\\&=\lambda ^{2}ze^{-\lambda z}\\&={\frac {\lambda ^{2}ze^{-\lambda z}}{\Gamma (2)}}&{\text{since }}\Gamma (2)=1!=1,\end{aligned}}

which is the pdf of $\operatorname {Gamma} (2,\lambda )$ , as desired.

$\Box$

Remark. The time to the $n$ -th event is also the sum of the $n$  successive interarrival times before the $n$ -th event.
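This can be checked by simulation. The sketch below (arbitrary $n$ , rate, sample size, and seed) sums $n$  i.i.d. $\operatorname {Exp} (\lambda )$  interarrival times and compares the sample mean with the $\operatorname {Gamma} (n,\lambda )$  mean $n/\lambda$ :

```python
import random

random.seed(3)
n, lam, trials = 5, 2.0, 50_000

# Time to the n-th event: a sum of n i.i.d. Exp(lam) interarrival times.
sample_mean = sum(
    sum(random.expovariate(lam) for _ in range(n)) for _ in range(trials)
) / trials

print(abs(sample_mean - n / lam) < 0.05)  # True
```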

Proposition. (Number of arrivals within a fixed time interval) The number of arrivals within a fixed time interval of length $t$  follows the $\operatorname {Pois} (\lambda t)$  distribution.

Proof. For each nonnegative integer $n$ , let $V$  be the interarrival time between the $n$ -th and $(n+1)$ -th arrivals, and $W$  be the time to the $n$ -th arrival, both measured from the beginning of the fixed time interval (we can treat the start as time zero because of the memoryless property). The joint pdf of $(V,W)$  is

{\begin{aligned}f(v,w)&=f_{V}(v)f_{W}(w)&{\text{by independence}}\\&=\underbrace {(\lambda e^{-\lambda v})} _{{\text{pdf of}}\;\operatorname {Exp} (\lambda )}\underbrace {\left({\frac {\lambda ^{n}w^{n-1}e^{-\lambda w}}{(n-1)!}}\right)} _{{\text{pdf of}}\operatorname {Gamma} (n,\lambda )}.\end{aligned}}

Let $N$  be the number of arrivals within the fixed time interval. The pmf of $N$  is
{\begin{aligned}\mathbb {P} (N=n)&=\mathbb {P} (W\leq t\cap \underbrace {V+W>t} _{V>t-W})\\&=\int _{0}^{t}\int _{t-w}^{\infty }\underbrace {f(v,w)} _{{\text{joint pdf of}}\;(V,W)}\,dv\,dw\\&=\int _{0}^{t}\int _{t-w}^{\infty }(\lambda e^{-\lambda v})\left({\frac {\lambda ^{n}w^{n-1}e^{-\lambda w}}{(n-1)!}}\right)\,dv\,dw\\&=\int _{0}^{t}{\frac {\lambda ^{n}w^{n-1}e^{-\lambda w}}{(n-1)!}}\int _{t-w}^{\infty }\lambda e^{-\lambda v}\,dv\,dw\\&={\frac {\lambda ^{n}}{(n-1)!}}\int _{0}^{t}w^{n-1}{\cancel {e^{-\lambda w}}}(0-(-e^{-\lambda (t{\cancel {-w}})}))\,dw\\&={\frac {\lambda ^{n}{\color {green}e^{-\lambda t}}}{(n-1)!}}\int _{0}^{t}w^{n-1}\,dw\\&={\frac {\lambda ^{n}e^{-\lambda t}}{(n-1)!}}\cdot \left({\frac {t^{n}}{n}}-0\right)\\&={\frac {e^{-\lambda t}(\lambda t)^{n}}{n!}}\end{aligned}}

which is the pmf of $\operatorname {Pois} (\lambda t)$ . The result follows.

$\Box$
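A simulation sketch of this proposition (arbitrary rate, interval length, sample size, and seed): generate arrivals from exponential interarrival times, count those in $[0,t]$ , and compare the sample mean of the count with the $\operatorname {Pois} (\lambda t)$  mean $\lambda t$ .

```python
import random

random.seed(4)
lam, t, trials = 1.5, 2.0, 50_000

def count_arrivals(lam, t):
    # Sum Exp(lam) interarrival times until time t is passed.
    time, n = 0.0, 0
    while True:
        time += random.expovariate(lam)
        if time > t:
            return n
        n += 1

mean_count = sum(count_arrivals(lam, t) for _ in range(trials)) / trials
print(abs(mean_count - lam * t) < 0.05)  # True
```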

Proposition. (Time to the first arrival with $n$  independent Poisson processes) Let $T_{1},T_{2},\dotsc ,T_{n}$  be independent random variables with $T_{i}\sim \operatorname {Exp} (\lambda _{i})$ , in which $i=1,2,\dotsc ,n$ . If we define $T=\min\{T_{1},\dotsc ,T_{n}\}$  (which is the time to the first arrival with $n$  independent Poisson processes), then $T\sim \operatorname {Exp} (\lambda _{1}+\lambda _{2}+\cdots +\lambda _{n})$ .

Proof. For each $t>0$ ,

{\begin{aligned}&&\mathbb {P} (T>t)&=\mathbb {P} (T_{1}>t\cap \cdots \cap T_{n}>t)\\&&&=\mathbb {P} (T_{1}>t)\cdots \mathbb {P} (T_{n}>t)&{\text{by independence}}\\&&&=[1-(\underbrace {1-e^{-\lambda _{1}t}} _{{\text{cdf of}}\;\operatorname {Exp} (\lambda _{1})})]\cdots [1-(\underbrace {1-e^{-\lambda _{n}t}} _{{\text{cdf of}}\;\operatorname {Exp} (\lambda _{n})})]\\&&&=e^{-t(\lambda _{1}+\cdots +\lambda _{n})}\\&\Rightarrow &\mathbb {P} (T\leq t)&=1-e^{-t(\lambda _{1}+\cdots +\lambda _{n})}\\&\Rightarrow &T&\sim \operatorname {Exp} (\lambda _{1}+\lambda _{2}+\cdots +\lambda _{n})\end{aligned}}

$\Box$
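This minimum-of-exponentials result is easy to check empirically. The sketch below (arbitrary rates, threshold, sample size, and seed) compares the empirical survival probability $\mathbb {P} (T>t)$  with $e^{-t(\lambda _{1}+\cdots +\lambda _{n})}$ :

```python
import math
import random

random.seed(5)
lams = [1.0, 2.0, 3.0]
t, trials = 0.2, 100_000

# T = min of independent Exp(lam_i) draws; expect P(T > t) = exp(-t * sum(lams)).
hits = sum(
    1 for _ in range(trials)
    if min(random.expovariate(l) for l in lams) > t
)
print(abs(hits / trials - math.exp(-t * sum(lams))) < 0.01)  # True
```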

Example. Suppose there are two service counters, counter A and counter B, with independent service times following the exponential distribution with rate $\lambda$ . For the past 10 minutes, John and Peter have been served at counter A and counter B respectively.

First, the time you need to wait to be served (i.e. the time until one of John and Peter leaves a counter) is the minimum of the remaining service times of John and Peter, which, by the memoryless property, are independent and follow the exponential distribution with rate $\lambda$ . Thus, your waiting time follows the exponential distribution with rate $\lambda +\lambda =2\lambda$ .

Suppose now John leaves counter A, and you are being served at counter A. Then, the probability that you leave your counter before Peter leaves his is $1/2$ : by the memoryless property and symmetry, the remaining service times of Peter and you are governed by the same chance mechanism, even though Peter has been served for longer. This is counterintuitive.

Exercise. Suppose the process of arrivals of car accidents is a Poisson process with unit rate. Let $T_{i}$  be the time to the $i$ -th car accident, and $X_{i}$  be the interarrival time between the $(i-1)$ -th and $i$ -th accidents.

1 Which of the following is (are) true?

• $T_{3}\sim \operatorname {Gamma} (3,1)$
• $T_{3}\sim \operatorname {Exp} (1)$
• $T_{3}\sim \operatorname {Exp} (3)$
• $T_{3}\sim \operatorname {Pois} (1)$
• $T_{3}\sim \operatorname {Pois} (3)$

2 Which of the following is (are) true?

• $X_{i}\sim \operatorname {Exp} (i)$
• $X_{i}\sim \operatorname {Exp} (1)$
• $X_{i}\sim \operatorname {Pois} (1)$
• $X_{i}-X_{i-1}\sim \operatorname {Exp} (1)$

3 Which of the following is (are) true?

• $T_{i}-T_{i-1}\sim \operatorname {Exp} (1)$
• $T_{i}-T_{i-1}\sim \operatorname {Gamma} (1,1)$
• $T_{i}-T_{i-1}\sim \operatorname {Pois} (1)$
• The pmf of the number of arrivals within a fixed time interval of length $t$  is $f(x)={\frac {e^{-t}t^{x}}{x!}}$ .