
The Gamma random variable is the waiting time for the \(r^{th}\) Poisson event, and also the sum of \(r\) Exponential random variables.

The Gamma random variable is the waiting time for the \(r^{th}\) Poisson event, or equivalently the sum of \(r\) Exponential random variables. The sum of \(m\) Gamma random variables that share the same rate parameter \(\lambda\) is again a Gamma random variable: it is the waiting time for the \((\sum_{i=1}^{m} r_i)^{th}\) Poisson event, or equivalently the sum of \(\sum_{i=1}^{m} r_i\) Exponential random variables.
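As a quick sanity check (not part of the derivation), a short NumPy simulation with assumed example values \(\lambda = 2\) and \(r = 5\) shows that the sum of \(r\) independent Exponential(\(\lambda\)) draws behaves like a single Gamma(\(r\), \(\lambda\)) draw:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, r, n = 2.0, 5, 100_000          # assumed example values

# Sum of r independent Exponential(lam) variables (NumPy uses scale = 1/lam) ...
sum_expo = rng.exponential(scale=1/lam, size=(n, r)).sum(axis=1)

# ... versus direct draws from Gamma(shape=r, rate=lam).
gamma_draws = rng.gamma(shape=r, scale=1/lam, size=n)

print(sum_expo.mean(), gamma_draws.mean())   # both ≈ r/lam = 2.5
print(sum_expo.var(),  gamma_draws.var())    # both ≈ r/lam**2 = 1.25
```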

Let \(Y\) denote the waiting time to the occurrence of the \(r^{th}\) Poisson event. The probability that fewer than \(r\) Poisson events occur in the time period \([0, y]\) is \(P(Y>y)=\sum_{k=0}^{r-1}e^{-\lambda y}\frac{(\lambda y)^k}{k!}\). Therefore, the probability that the \(r^{th}\) event occurs within \([0, y]\) is \[\begin{align} F_Y(y)=P(Y\le y)&=1-P(Y>y)\\ &=1-\sum_{k=0}^{r-1}e^{-\lambda y}\frac{(\lambda y)^k}{k!} \end{align}\]
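This identity between the Gamma CDF and the Poisson tail can be checked numerically; the sketch below assumes \(\lambda = 2\), \(r = 5\), \(y = 1.7\) and uses SciPy's poisson and gamma distributions:

```python
from scipy import stats

lam, r, y = 2.0, 5, 1.7                            # assumed example values

# P(Y <= y) computed from the Poisson identity above ...
tail = sum(stats.poisson.pmf(k, mu=lam * y) for k in range(r))
print(1 - tail)

# ... matches SciPy's Gamma CDF (shape a = r, scale = 1/lam).
print(stats.gamma.cdf(y, a=r, scale=1/lam))
```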

Then, differentiating the CDF gives \[\begin{align} f_Y(y)=\frac{d}{dy}F_Y(y)&=\frac{d}{dy}\Biggl(1-\sum_{k=0}^{r-1}e^{-\lambda y}\frac{(\lambda y)^k}{k!}\Biggr)\\ &=\sum_{k=0}^{r-1}\lambda e^{-\lambda y}\frac{(\lambda y)^k}{k!}-\sum_{k=1}^{r-1}\lambda e^{-\lambda y}\frac{(\lambda y)^{k-1}}{(k-1)!}\\ &=\sum_{k=0}^{r-1}\lambda e^{-\lambda y}\frac{(\lambda y)^k}{k!}-\sum_{k=0}^{r-2}\lambda e^{-\lambda y}\frac{(\lambda y)^k}{k!}\\ &=\lambda e^{-\lambda y}\frac{(\lambda y)^{r-1}}{(r-1)!}\\ &=\frac{\lambda^r}{(r-1)!}y^{r-1}e^{-\lambda y}\\ &=\frac{\lambda^r}{\Gamma(r)}y^{r-1}e^{-\lambda y}, \quad y>0 \end{align}\] This is the Gamma distribution with shape parameter \(r\) and rate parameter \(\lambda\).
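The derived density can likewise be compared against SciPy's implementation; this is only an illustrative check with assumed values \(\lambda = 2\), \(r = 5\), \(y = 1.7\):

```python
import math
from scipy import stats

lam, r, y = 2.0, 5, 1.7                            # assumed example values

# Density derived above: lam^r / Gamma(r) * y^(r-1) * exp(-lam * y)
pdf_derived = lam**r / math.gamma(r) * y**(r - 1) * math.exp(-lam * y)
print(pdf_derived)

# SciPy parameterizes the Gamma by shape a and scale = 1/lam.
print(stats.gamma.pdf(y, a=r, scale=1/lam))        # same value
```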

The waiting time for the first Poisson event \(S_1\) corresponds to \(r=1\): \[f_Y(y)=\lambda e^{-\lambda y}, \quad y\ge 0\] which is the density of the Exponential distribution. The probability that the waiting time for the first Poisson event \(S_1\) is longer than time \(t\) is \[\mathbb P(S_1 > t)=\int_{t}^{\infty}\lambda e^{-\lambda y}dy=-e^{-\lambda y}\big |_{t}^{\infty}=-0+e^{-\lambda t}=e^{-\lambda t}\] and the CDF of \(S_1\) is \[\mathbb P(S_1 \le t)=\int_{0}^{t}\lambda e^{-\lambda y}dy=-e^{-\lambda y}\big |_{0}^{t}=1-e^{-\lambda t}\]
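For this \(r=1\) special case, the survival function \(e^{-\lambda t}\) can be recovered from either the Exponential or the Gamma distribution in SciPy (assumed values \(\lambda = 2\), \(t = 0.8\)):

```python
import math
from scipy import stats

lam, t = 2.0, 0.8                                  # assumed example values

print(stats.expon.sf(t, scale=1/lam))              # P(S1 > t) from the Exponential
print(stats.gamma.sf(t, a=1, scale=1/lam))         # same thing as Gamma with r = 1
print(math.exp(-lam * t))                          # closed form e^{-lam t}
```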

The waiting time between the \((i-1)^{th}\) and the \(i^{th}\) Poisson event (the \(i^{th}\) inter-arrival time \(S_i\)) also follows the Exponential distribution \[f_Y(y)=\lambda e^{-\lambda y}, \quad y\ge 0\] and is independent of the earlier inter-arrival times \(S_1, \dots, S_{i-1}\).
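A small simulation illustrates this: generating a Poisson process on a long window (an assumed rate \(\lambda = 2\) below) and looking at its inter-arrival gaps gives a mean close to \(1/\lambda\) and essentially no correlation between consecutive gaps.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, T = 2.0, 50_000.0                             # assumed rate and window length

# Simulate a rate-lam Poisson process on [0, T]: the number of events is
# Poisson(lam*T), and given that count the event times are uniform on [0, T].
n = rng.poisson(lam * T)
times = np.sort(rng.uniform(0.0, T, size=n))
gaps = np.diff(times)

print(gaps.mean(), 1/lam)                          # inter-arrival mean ≈ 1/lam
print(np.corrcoef(gaps[:-1], gaps[1:])[0, 1])      # ≈ 0, consistent with independence
```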

The moment-generating function for a Gamma random variable \(Y\) is, for \(t<\lambda\), \[\begin{align} M_Y(t)=E(e^{tY})&=\int_{0}^{+\infty}e^{ty}\frac{\lambda^r}{\Gamma(r)}y^{r-1}e^{-\lambda y}dy\\ &=\frac{\lambda^r}{(\lambda-t)^r}\int_{0}^{+\infty}\frac{(\lambda-t)^r}{\Gamma(r)}y^{r-1}e^{-(\lambda-t)y}dy\\ &=\frac{\lambda^r}{(\lambda-t)^r}\\ &=\Bigl(\frac{\lambda}{\lambda-t}\Bigr)^r\\ &=\Bigl(1-\frac{t}{\lambda}\Bigr)^{-r} \end{align}\] where the remaining integral equals 1 because it integrates the Gamma\((r, \lambda-t)\) density over its support.
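The closed form of the MGF can be checked by Monte Carlo (keeping in mind it only exists for \(t < \lambda\)); the values \(\lambda = 2\), \(r = 5\), \(t = 0.5\) below are just an assumed example:

```python
import numpy as np

rng = np.random.default_rng(2)
lam, r, t = 2.0, 5, 0.5                            # assumed values, with t < lam

y = rng.gamma(shape=r, scale=1/lam, size=1_000_000)
print(np.mean(np.exp(t * y)))                      # Monte Carlo estimate of E[e^{tY}]
print((1 - t/lam) ** (-r))                         # closed form (1 - t/lam)^{-r} ≈ 4.21
```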

The mean is \[\begin{align} E(Y)=M_Y^{(1)}(0)&=\frac{d}{dt}\Bigl(1-\frac{t}{\lambda}\Bigr)^{-r}\Big|_{t=0}\\ &=\frac{r}{\lambda}\Bigl(1-\frac{t}{\lambda}\Bigr)^{-r-1}\Big|_{t=0}\\ &=\frac{r}{\lambda} \end{align}\]

The second moment is \[\begin{align} E(Y^2)=M_Y^{(2)}(0)&=\frac{d}{dt}\Biggl(\frac{r}{\lambda}\Bigl(1-\frac{t}{\lambda}\Bigr)^{-r-1}\Biggr)\Big|_{t=0}\\ &=\frac{r(r+1)}{\lambda^2}\Bigl(1-\frac{t}{\lambda}\Bigr)^{-r-2}\Big|_{t=0}\\ &=\frac{r(r+1)}{\lambda^2} \end{align}\]

Then, \(Var(Y)=E(Y^2)-(E(Y))^2=\frac{r(r+1)}{\lambda^2}-\frac{r^2}{\lambda^2}=\frac{r}{\lambda^2}\).
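These two moments agree with what SciPy reports for the same (assumed) example parameters \(\lambda = 2\), \(r = 5\):

```python
from scipy import stats

lam, r = 2.0, 5                                    # assumed example values

mean, var = stats.gamma.stats(a=r, scale=1/lam, moments="mv")
print(mean, r/lam)                                 # both 2.5
print(var,  r/lam**2)                              # both 1.25
```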

The pdf of the sum of \(m\) independent Gamma random variables with shape parameters \(r_i\) that share the same rate parameter \(\lambda\) is obtained by integrating the product of their densities over the set \(\sum_{i=1}^{m} u_i = t\): \[\begin{align} f_{\sum_{i=1}^{m} Y_i}(t)&=\frac{e^{-\lambda \sum u_i}}{\prod_{i=1}^{m}\Gamma(r_i)}\lambda^{\sum r_i}\int_{0}^{t}\Bigl(\prod u_i^{r_i-1}\Bigr)du \quad (\textstyle\sum u_i=t)\\ &=\frac{e^{-\lambda t}}{\prod_{i=1}^{m}\Gamma(r_i)}\lambda^{\sum r_i}t^{\sum (r_i-1)}t^{m-1}\int_{0}^{1}\Bigl(\prod v_i^{r_i-1}\Bigr)dv\quad (u_i=tv_i,\ \textstyle\sum v_i=1)\\ &=\frac{e^{-\lambda t}}{\prod_{i=1}^{m}\Gamma(r_i)}\lambda^{\sum r_i}t^{(\sum r_i)-1}\frac{\prod_{i=1}^{m}\Gamma(r_i)}{\Gamma(\sum r_i)}\\ &=\frac{\lambda^{\sum r_i}}{\Gamma(\sum r_i)}t^{(\sum r_i)-1}e^{-\lambda t} \end{align}\] which is the Gamma density with shape \(\sum_{i=1}^{m} r_i\) and rate \(\lambda\), as claimed at the start.
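As a final check that the sum of independent Gammas with a shared rate is again a Gamma with shape \(\sum r_i\), the sketch below (assumed shapes 1.5, 2.0, 3.5 and rate \(\lambda = 2\)) compares simulated sums against the claimed distribution with a Kolmogorov–Smirnov test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
lam, shapes, n = 2.0, [1.5, 2.0, 3.5], 200_000     # assumed example values (m = 3)

# Sum m independent Gamma(r_i, lam) variables ...
total = sum(rng.gamma(shape=r, scale=1/lam, size=n) for r in shapes)

# ... and test them against Gamma(sum r_i, lam): a large p-value means the
# sample is consistent with the claimed distribution.
ks = stats.kstest(total, "gamma", args=(sum(shapes), 0, 1/lam))
print(ks.pvalue)
```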