# Statistics/Distributions/Gamma

### Gamma Distribution

| Property | Value |
|---|---|
| Parameters | ${\displaystyle k>0}$ (shape), ${\displaystyle \theta >0}$ (scale) |
| Support | ${\displaystyle x\in (0,\,\infty )}$ |
| Probability density function | ${\displaystyle {\frac {1}{\Gamma (k)\theta ^{k}}}x^{k-1}e^{-x/\theta }}$ |
| Cumulative distribution function | ${\displaystyle {\frac {1}{\Gamma (k)}}\gamma \left(k,\,{\frac {x}{\theta }}\right)}$ |
| Mean | ${\displaystyle \operatorname {E} [X]=k\theta }$; ${\displaystyle \operatorname {E} [\ln X]=\psi (k)+\ln \theta }$ (see digamma function) |
| Median | No simple closed form |
| Mode | ${\displaystyle (k-1)\theta }$ for ${\displaystyle k>1}$ |
| Variance | ${\displaystyle \operatorname {Var} [X]=k\theta ^{2}}$; ${\displaystyle \operatorname {Var} [\ln X]=\psi _{1}(k)}$ (see trigamma function) |
| Skewness | ${\displaystyle {\frac {2}{\sqrt {k}}}}$ |
| Excess kurtosis | ${\displaystyle {\frac {6}{k}}}$ |
| Entropy | ${\displaystyle k+\ln \theta +\ln[\Gamma (k)]+(1-k)\psi (k)}$ |

The Gamma distribution is important for technical reasons, since it is the parent of the exponential distribution and underlies many other distributions.

The probability density function is:

${\displaystyle f_{x}(x)={\begin{cases}{\frac {1}{a^{p}\Gamma (p)}}x^{p-1}e^{-x/a},&{\mbox{if }}x\geq 0\\0,&{\mbox{if }}x<0\end{cases}}\quad a,p>0}$

where ${\displaystyle \Gamma (p)=\int _{0}^{\infty }t^{p-1}e^{-t}\,dt\,}$  is the Gamma function. The cumulative distribution function has no simple closed form unless ${\displaystyle p=1}$, in which case the Gamma distribution reduces to the exponential distribution. The Gamma distribution of the stochastic variable X is denoted as ${\displaystyle X\in \Gamma (p,a)}$ .
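As a quick sketch in plain Python (the function name `gamma_pdf` is my own, not from any library), the density above can be evaluated directly; for ${\displaystyle p=1}$ it matches the exponential density ${\displaystyle e^{-x/a}/a}$, since ${\displaystyle \Gamma (1)=1}$:

```python
import math

def gamma_pdf(x, p, a):
    # Density of the Gamma(p, a) distribution defined above
    # (p = shape, a = scale); zero for negative x.
    if x < 0:
        return 0.0
    return x ** (p - 1) * math.exp(-x / a) / (a ** p * math.gamma(p))

# For p = 1 the density reduces to the exponential density e^{-x/a} / a:
x, a = 2.0, 1.5
print(gamma_pdf(x, 1, a))      # Gamma(1, a) density at x
print(math.exp(-x / a) / a)    # exponential density, same value
```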

Alternatively, the gamma distribution can be parameterized in terms of a shape parameter ${\displaystyle \alpha =k}$  and an inverse scale parameter ${\displaystyle \beta =1/\theta }$ , called a rate parameter:

${\displaystyle g(x;\alpha ,\beta )=Kx^{\alpha -1}e^{-\beta \,x}\ \mathrm {for} \ x>0\,\!.}$

where the constant ${\displaystyle K}$  is determined by requiring that the density integrate to 1:

${\displaystyle \int _{-\infty }^{+\infty }g(x;\alpha ,\beta )\mathrm {d} x\,=\int _{0}^{+\infty }Kx^{\alpha -1}e^{-\beta \,x}\mathrm {d} x\,=1}$

It follows that:

${\displaystyle K\int _{0}^{+\infty }x^{\alpha -1}e^{-\beta \,x}\mathrm {d} x\,=1}$
${\displaystyle K={\frac {1}{\int _{0}^{+\infty }x^{\alpha -1}e^{-\beta \,x}\mathrm {d} x}}}$

and, with the change of variable ${\displaystyle y=\beta x}$:

{\displaystyle {\begin{aligned}K&={\frac {1}{\int _{0}^{+\infty }{\frac {y^{\alpha -1}}{\beta ^{\alpha -1}}}e^{-y}{\frac {\mathrm {d} y}{\beta }}}}\\&={\frac {1}{{\frac {1}{\beta ^{\alpha }}}\int _{0}^{+\infty }y^{\alpha -1}e^{-y}\mathrm {d} y}}\\&={\frac {\beta ^{\alpha }}{\int _{0}^{+\infty }y^{\alpha -1}e^{-y}\mathrm {d} y}}\\&={\frac {\beta ^{\alpha }}{\Gamma (\alpha )}}\end{aligned}}}

so the density becomes:

${\displaystyle g(x;\alpha ,\beta )={\frac {\beta ^{\alpha }\,x^{\alpha -1}e^{-\beta \,x}}{\Gamma (\alpha )}}\ \mathrm {for} \ x>0\,\!.}$
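The two parameterizations describe the same density. A small sketch (function names are illustrative) checks that setting ${\displaystyle \alpha =k}$ and ${\displaystyle \beta =1/\theta }$ makes them agree pointwise:

```python
import math

def pdf_shape_scale(x, k, theta):
    # Shape-scale form: x^{k-1} e^{-x/theta} / (Gamma(k) theta^k).
    return x ** (k - 1) * math.exp(-x / theta) / (math.gamma(k) * theta ** k)

def pdf_shape_rate(x, alpha, beta):
    # Shape-rate form: beta^alpha x^{alpha-1} e^{-beta x} / Gamma(alpha).
    return beta ** alpha * x ** (alpha - 1) * math.exp(-beta * x) / math.gamma(alpha)

# With alpha = k and beta = 1/theta the two forms coincide:
k, theta, x = 3.0, 2.0, 1.7
print(pdf_shape_scale(x, k, theta))
print(pdf_shape_rate(x, k, 1.0 / theta))
```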

#### Probability Density Function

We first check that the total integral of the probability density function is 1.

${\displaystyle \int _{-\infty }^{\infty }{\frac {1}{a^{p}\Gamma (p)}}x^{p-1}e^{-x/a}dx}$

Since the density is zero for ${\displaystyle x<0}$, the integral runs over ${\displaystyle [0,\infty )}$. Now we let ${\displaystyle y=x/a}$, which means that ${\displaystyle dy=dx/a}$:

${\displaystyle {\frac {1}{\Gamma (p)}}\int _{0}^{\infty }y^{p-1}e^{-y}dy}$
${\displaystyle {\frac {1}{\Gamma (p)}}\Gamma (p)=1}$
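The normalization can also be checked numerically. The sketch below uses a plain midpoint rule (helper names are my own) and truncates the integral at an upper limit where the tail is negligible:

```python
import math

def gamma_pdf(x, p, a):
    # Gamma(p, a) density from the text; zero for x < 0.
    return 0.0 if x < 0 else x ** (p - 1) * math.exp(-x / a) / (a ** p * math.gamma(p))

def midpoint_integral(f, lo, hi, n=200000):
    # Plain midpoint rule; a rough numerical check, not a library routine.
    h = (hi - lo) / n
    return h * sum(f(lo + (i + 0.5) * h) for i in range(n))

p, a = 2.5, 1.3
total = midpoint_integral(lambda x: gamma_pdf(x, p, a), 0.0, 60.0)
print(total)  # close to 1
```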

#### Mean

${\displaystyle \operatorname {E} [X]=\int _{-\infty }^{\infty }x\cdot {\frac {1}{a^{p}\Gamma (p)}}x^{p-1}e^{-x/a}dx}$

Now we let ${\displaystyle y=x/a}$, which means that ${\displaystyle dy=dx/a}$.

${\displaystyle \operatorname {E} [X]=\int _{0}^{\infty }ay\cdot {\frac {1}{\Gamma (p)}}y^{p-1}e^{-y}dy}$
${\displaystyle \operatorname {E} [X]={\frac {a}{\Gamma (p)}}\int _{0}^{\infty }y^{p}e^{-y}dy}$
${\displaystyle \operatorname {E} [X]={\frac {a}{\Gamma (p)}}\Gamma (p+1)}$

We now use the fact that ${\displaystyle \Gamma (z+1)=z\Gamma (z)}$

${\displaystyle \operatorname {E} [X]={\frac {a}{\Gamma (p)}}p\Gamma (p)=ap}$
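The result ${\displaystyle \operatorname {E} [X]=ap}$ can be sanity-checked numerically with a simple midpoint rule (helper names are illustrative):

```python
import math

def gamma_pdf(x, p, a):
    # Gamma(p, a) density from the text.
    return x ** (p - 1) * math.exp(-x / a) / (a ** p * math.gamma(p))

def midpoint_integral(f, lo, hi, n=200000):
    # Plain midpoint rule; a rough numerical check only.
    h = (hi - lo) / n
    return h * sum(f(lo + (i + 0.5) * h) for i in range(n))

p, a = 4.0, 0.5
mean = midpoint_integral(lambda x: x * gamma_pdf(x, p, a), 0.0, 40.0)
print(mean)  # numerically close to a * p = 2.0
```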

#### Variance

We first calculate ${\displaystyle \operatorname {E} [X^{2}]}$:

${\displaystyle \operatorname {E} [X^{2}]=\int _{-\infty }^{\infty }x^{2}\cdot {\frac {1}{a^{p}\Gamma (p)}}x^{p-1}e^{-x/a}dx}$

Now we let ${\displaystyle y=x/a}$, which means that ${\displaystyle dy=dx/a}$.

${\displaystyle \operatorname {E} [X^{2}]=\int _{0}^{\infty }a^{2}y^{2}\cdot {\frac {1}{a^{p}\Gamma (p)}}(ay)^{p-1}e^{-y}\,a\,dy}$
${\displaystyle \operatorname {E} [X^{2}]={\frac {a^{2}}{\Gamma (p)}}\int _{0}^{\infty }y^{p+1}e^{-y}dy}$
${\displaystyle \operatorname {E} [X^{2}]={\frac {a^{2}}{\Gamma (p)}}\Gamma (p+2)=pa^{2}(p+1)}$

Now we calculate the variance:

${\displaystyle \operatorname {Var} (X)=\operatorname {E} [X^{2}]-(\operatorname {E} [X])^{2}}$
${\displaystyle \operatorname {Var} (X)=pa^{2}(p+1)-(ap)^{2}=pa^{2}}$
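Both moments can be checked by simulation: Python's standard library draws Gamma variates via `random.gammavariate(alpha, beta)`, where `alpha` is the shape and `beta` the scale (the text's ${\displaystyle p}$ and ${\displaystyle a}$):

```python
import random
import statistics

random.seed(42)  # fixed seed so the run is reproducible
p, a = 2.0, 3.0
samples = [random.gammavariate(p, a) for _ in range(200000)]

# Sample moments should be close to the theoretical values:
print(statistics.mean(samples))      # approx a * p = 6
print(statistics.variance(samples))  # approx p * a^2 = 18
```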