Let $X_1,\dots,X_n$ be i.i.d. random variables with moment generating function $M(t)=E[\exp(tX_1)]$, which is finite for all $t$. Let $\tilde{X}_n=(X_1+\cdots+X_n)/n$.
(a) Prove that $P[X_1>a]\leq\exp[-h(a)]$, where $h(a)=\sup_{t\geq 0}[at-\psi(t)]$ and $\psi(t)=\log M(t)$.
(b) Prove that $P[\tilde{X}_n\geq a]\leq\exp[-nh(a)]$.
(c) Assume $E[X_1]=0$. Use the result of (b) to establish that $\tilde{X}_n\to 0$ almost surely.
(a)
$$
\begin{aligned}
P[X_1>a] &= \int_{X_1>a} 1\,dF = \int_{X_1>a}\frac{\exp(ta)}{\exp(ta)}\,dF\\
&= e^{-at}\int_{X_1>a} e^{at}\,dF \le e^{-at}\int_{X_1>a} e^{tX_1}\,dF\\
&\le e^{-at}\int_{\Omega} e^{tX_1}\,dF = e^{-at}E[\exp(tX_1)]
\end{aligned}
$$
Note that the inequality $e^{at}\leq e^{tX_1}$ on the event $\{X_1>a\}$ requires $t\geq 0$. Thus the bound $P[X_1>a]\leq e^{-at}M(t)=\exp[-(at-\psi(t))]$ holds for every $t\geq 0$, and taking the infimum of the right-hand side over $t\geq 0$, i.e. the supremum of $at-\psi(t)$ in the exponent, gives the desired result $P[X_1>a]\leq\exp[-h(a)]$.
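As a quick numerical sanity check (an illustration, not part of the original argument): for a standard normal $X_1$ we have $\psi(t)=t^2/2$, so $h(a)=\sup_{t\geq 0}(at-t^2/2)=a^2/2$ for $a\geq 0$. The sketch below, with this assumed distribution, compares the exact tail probability with the Chernoff bound.

```python
import math

# Illustrative check of P[X_1 > a] <= exp(-h(a)) for X_1 ~ N(0, 1).
# For a standard normal, psi(t) = t^2/2, so h(a) = sup_{t>=0}(at - t^2/2) = a^2/2
# (the supremum is attained at t = a when a >= 0).

def normal_tail(a):
    """Exact P[X_1 > a] for a standard normal, via the complementary error function."""
    return 0.5 * math.erfc(a / math.sqrt(2))

def chernoff_bound(a):
    """The bound exp(-h(a)) with h(a) = a^2/2."""
    return math.exp(-a * a / 2)

for a in [0.5, 1.0, 2.0, 3.0]:
    print(f"a={a}: tail={normal_tail(a):.6f}  bound={chernoff_bound(a):.6f}")
```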
(b)
$$
\begin{aligned}
P[\tilde{X}_n>a] &= \int_{\tilde{X}_n>a} 1\,dF = \int_{\tilde{X}_n>a}\frac{\exp(nta)}{\exp(nta)}\,dF\\
&= e^{-nat}\int_{\sum_{i=1}^{n}X_i>an} e^{nat}\,dF \le e^{-nat}\int_{\sum_{i=1}^{n}X_i>an} e^{t\sum_{i=1}^{n}X_i}\,dF\\
&\le e^{-nat}\int_{\Omega} e^{t\sum_{i=1}^{n}X_i}\,dF = e^{-nat}\left(\int_{\Omega} e^{tX_1}\,dF\right)^{n}
\end{aligned}
$$
where the last equality follows from the fact that the $X_i$ are independent and identically distributed. As in (a), the middle inequality requires $t\geq 0$, so for every $t\geq 0$ we have $P[\tilde{X}_n>a]\leq e^{-n(at-\psi(t))}$; taking the supremum of $at-\psi(t)$ over $t\geq 0$ in the exponent gives $P[\tilde{X}_n>a]\leq\exp[-nh(a)]$, and the same argument with $>$ replaced by $\geq$ throughout yields the stated bound $P[\tilde{X}_n\geq a]\leq\exp[-nh(a)]$.
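A Monte Carlo sketch of this bound, again assuming standard normal summands (so $h(a)=a^2/2$ as in the check for part (a)); the sample size and parameter values are arbitrary choices for illustration:

```python
import math
import random

# Monte Carlo check of P[X~_n >= a] <= exp(-n h(a)) for N(0,1) summands,
# where h(a) = a^2/2 for a standard normal.

random.seed(0)
n, a, trials = 20, 0.5, 200_000

hits = 0
for _ in range(trials):
    mean = sum(random.gauss(0.0, 1.0) for _ in range(n)) / n
    if mean >= a:
        hits += 1

estimate = hits / trials
bound = math.exp(-n * a * a / 2)
print(f"P[X~_n >= a] ~= {estimate:.5f}  <=  exp(-n h(a)) = {bound:.5f}")
```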
(c) Fix $\epsilon>0$. Since $M(t)$ is finite for all $t$, $\psi$ is differentiable at $0$ with $\psi(0)=0$ and $\psi'(0)=E[X_1]=0$, so $\epsilon t-\psi(t)=\epsilon t-o(t)>0$ for sufficiently small $t>0$; hence $h(\epsilon)>0$. By (b), $P[\tilde{X}_n\geq\epsilon]\leq\exp[-nh(\epsilon)]$, and applying the same bound to $-X_1,\dots,-X_n$ gives $P[\tilde{X}_n\leq-\epsilon]\leq\exp[-nh_-(\epsilon)]$, where $h_-$ is the corresponding rate function of $-X_1$ and $h_-(\epsilon)>0$ by the same argument. Therefore
$$\sum_{n=1}^{\infty}P[|\tilde{X}_n|\geq\epsilon]\leq\sum_{n=1}^{\infty}\left(e^{-nh(\epsilon)}+e^{-nh_-(\epsilon)}\right)<\infty,$$
and by the Borel–Cantelli lemma $P[|\tilde{X}_n|\geq\epsilon\text{ i.o.}]=0$. Since $\epsilon>0$ was arbitrary, $\tilde{X}_n\to 0$ almost surely.
Let $N_1(t),N_2(t)$ be independent homogeneous Poisson processes with rates $\lambda_1,\lambda_2$, respectively. Let $Z$ be the time of the first jump for the process $N_1(t)+N_2(t)$ and let $J$ be the random index of the component process that made the first jump. Find the joint distribution of $(J,Z)$. In particular, establish that $J,Z$ are independent and that $Z$ is exponentially distributed.
Show $Z$ is exponentially distributed
Let $\tau$ be the first time that a Poisson process $N(t)$ jumps.
$$
\begin{aligned}
p_\tau(x) = \lim_{\epsilon\to 0}\frac{F_\tau(x)-F_\tau(x-\epsilon)}{\epsilon}
&= \lim_{\epsilon\to 0}\frac{P\big(N(x)>0\cap N(x-\epsilon)=0\big)}{\epsilon}\\
&= \lim_{\epsilon\to 0}\frac{1}{\epsilon}\,P\big(N(x-\epsilon)=0\big)\cdot P\big(N(x)-N(x-\epsilon)>0\big)\\
&= \lim_{\epsilon\to 0}\frac{1}{\epsilon}\,e^{-\lambda(x-\epsilon)}\big(1-e^{-\lambda\epsilon}\big) = \lambda e^{-\lambda x},
\end{aligned}
$$
since $(1-e^{-\lambda\epsilon})/\epsilon\to\lambda$ as $\epsilon\to 0$; the product step uses the independence of the increments $N(x-\epsilon)$ and $N(x)-N(x-\epsilon)$.
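To connect the infinitesimal argument above with a computation, the following sketch (the rate, step size, and trial count are assumptions for illustration) approximates a rate-$\lambda$ Poisson process by independent Bernoulli($\lambda\,dt$) increments on a fine grid and checks that the empirical mean of the first jump time is close to $1/\lambda$:

```python
import random

# Approximate a rate-lambda Poisson process by Bernoulli(lambda*dt) increments
# on a fine grid, mirroring the infinitesimal argument above, and record the
# first jump time tau. For an Exp(lambda) variable, E[tau] = 1/lambda.

random.seed(0)
lam, dt, trials = 2.0, 1e-3, 20_000

def first_jump_time():
    t = 0.0
    while True:
        t += dt
        if random.random() < lam * dt:  # a jump occurs in (t - dt, t]
            return t

times = [first_jump_time() for _ in range(trials)]
print(f"empirical E[tau] = {sum(times)/trials:.4f}, 1/lambda = {1/lam:.4f}")
```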
$N_1(t)+N_2(t)$ is a Poisson process with parameter $\lambda_1+\lambda_2$
Proof: There are three conditions to check:
(i) $N_1(0)+N_2(0)=0$ almost surely.
(ii) For $t>s$, is $N_1(t)+N_2(t)-N_1(s)-N_2(s)$ independent of $N_1(s)+N_2(s)$? This is true since $N_1$ and $N_2$ each have independent increments and the two processes are independent of each other.
(iii) For $t>s$, is $N_1(t)+N_2(t)-N_1(s)-N_2(s)$ distributed Poisson with parameter $(\lambda_1+\lambda_2)(t-s)$?
This is true since the sum of independent Poisson random variables is again Poisson, with parameter equal to the sum of the individual parameters.
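An illustrative check of the superposition property (parameter values assumed for the example): generate each process via i.i.d. exponential inter-arrival times, merge the jump times, and verify that the merged inter-arrival times have mean $1/(\lambda_1+\lambda_2)$:

```python
import random

# Merge jump times of two independent Poisson processes and check that the
# merged inter-arrival times have mean 1/(lambda1 + lambda2).

random.seed(0)
lam1, lam2, horizon = 1.5, 2.5, 10_000.0

def jump_times(lam):
    """Jump times of a rate-lam Poisson process on [0, horizon]."""
    times, t = [], 0.0
    while True:
        t += random.expovariate(lam)  # i.i.d. Exp(lam) inter-arrival times
        if t > horizon:
            return times
        times.append(t)

merged = sorted(jump_times(lam1) + jump_times(lam2))
gaps = [b - a for a, b in zip(merged, merged[1:])]
print(f"mean gap = {sum(gaps)/len(gaps):.4f}, 1/(lam1+lam2) = {1/(lam1+lam2):.4f}")
```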
Joint distribution of $(J,Z)$
Let $T_1,T_2$ be the first jump times of $N_1,N_2$ respectively; by the previous section they are independent with $T_i\sim\operatorname{Exp}(\lambda_i)$, and $Z=\min(T_1,T_2)$ while $J$ is the index of the smaller one. Then
$$P(J=1,\,Z\in dx)=P(T_1\in dx,\,T_2>x)=\lambda_1 e^{-\lambda_1 x}e^{-\lambda_2 x}\,dx=\lambda_1 e^{-(\lambda_1+\lambda_2)x}\,dx,$$
and likewise
$$P(J=2,\,Z\in dx)=\lambda_2 e^{-(\lambda_1+\lambda_2)x}\,dx.$$
The joint density factors as
$$\frac{\lambda_j}{\lambda_1+\lambda_2}\cdot(\lambda_1+\lambda_2)e^{-(\lambda_1+\lambda_2)x},$$
so $P(J=j)=\lambda_j/(\lambda_1+\lambda_2)$, $Z\sim\operatorname{Exp}(\lambda_1+\lambda_2)$, and $J$ and $Z$ are independent.
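A short simulation sketch of $(J,Z)$ (rates chosen arbitrarily for illustration): draw the two first jump times directly, take the minimum and the winning index, and check $P(J=1)=\lambda_1/(\lambda_1+\lambda_2)$, $E[Z]=1/(\lambda_1+\lambda_2)$, and that the conditional means of $Z$ given $J$ agree, consistent with independence:

```python
import random

# Simulate (J, Z): Z = min of the two first jump times, J = index of the winner.
# Check P(J=1), E[Z], and that E[Z | J=1] ~= E[Z | J=2].

random.seed(0)
lam1, lam2, trials = 1.0, 3.0, 200_000

samples = []
for _ in range(trials):
    t1 = random.expovariate(lam1)  # first jump of N_1 ~ Exp(lam1)
    t2 = random.expovariate(lam2)  # first jump of N_2 ~ Exp(lam2)
    samples.append((1, t1) if t1 < t2 else (2, t2))

z1 = [z for j, z in samples if j == 1]
z2 = [z for j, z in samples if j == 2]
print(f"P(J=1) ~= {len(z1)/trials:.4f}  (theory {lam1/(lam1+lam2):.4f})")
print(f"E[Z]   ~= {sum(z1+z2)/trials:.4f}  (theory {1/(lam1+lam2):.4f})")
print(f"E[Z|J=1] ~= {sum(z1)/len(z1):.4f}, E[Z|J=2] ~= {sum(z2)/len(z2):.4f}")
```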
Consider the following process $\{X_n\}$ taking values in $\{0,1,\dots\}$. Assume $U_n,\ n=1,2,\dots$ is an i.i.d. sequence of positive integer-valued random variables and let $X_0$ be independent of the $U_n$. Then
$$X_n=\begin{cases}X_{n-1}-1 & \text{if } X_{n-1}\neq 0,\\ U_k-1 & \text{if } X_{n-1}=0 \text{ for the }k\text{th time.}\end{cases}$$
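The recursion is easy to see in simulation. In the sketch below, the law of the $U_k$ (a geometric-type distribution) and the initialization of $X_0$ are assumptions made purely for illustration; the chain counts down by one each step and, on its $k$th visit to $0$, restarts at $U_k-1$:

```python
import random

# Simulate {X_n}: decrement by 1 until hitting 0; on the k-th visit to 0,
# restart at U_k - 1, where the U_k are i.i.d. positive integers.
# Here U_k is geometric-type, purely for illustration.

random.seed(0)

def draw_u():
    """An i.i.d. positive integer; an arbitrary choice for the example."""
    u = 1
    while random.random() < 0.5:  # add 1 with probability 1/2, repeatedly
        u += 1
    return u

x = draw_u() - 1  # take X_0 to have the law of U_1 - 1, for simplicity
path = [x]
for _ in range(30):
    x = x - 1 if x != 0 else draw_u() - 1  # the defining recursion
    path.append(x)
print(path)
```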