
Problem:

$K_1, K_2,…$ is an independent, identically distributed sequence of Poisson random variables with $E[K] = 1$. $W_n = K_1 +…+ K_n$. Use the approximation:
$P[k_1 \leq K \leq k_2] \approx \phi\left(\frac{k_2+0.5-E[K]}{\sigma_K}\right)-\phi\left(\frac{k_1-0.5-E[K]}{\sigma_K}\right)$ to estimate $P[W_n = n]$ for $n = 4, 25, \text{and } 64$. Compare your approximations with the exact value of $P[W_n = n]$.

Attempt at Solution:

The equation for the probability mass function of a Poisson random variable is

$P_K(k) = \begin{cases} \frac{\alpha^ke^{-\alpha}}{k!}, & k = 0, 1, 2, ...;\\ 0, & \text{otherwise.} \end{cases}$

I know that for a Poisson random variable, $E[K] = Var[K] = \alpha$, so in this case, $\alpha = E[K] = Var[K] = \sigma_K = 1$. Further, by substitution, the probability mass function for any $K_i$ in the sequence is

$P_K(k) = \begin{cases} \frac{1}{e(k!)}, & k = 0, 1, 2, ...;\\ 0, & \text{otherwise.} \end{cases}$
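As a quick sanity check, this pmf can be evaluated numerically; here is a minimal Python sketch (the helper name `pmf` is my own, not from the problem):

```python
from math import exp, factorial

def pmf(k):
    """P_K(k) for a Poisson random variable with alpha = 1."""
    return 1 / (exp(1) * factorial(k)) if k >= 0 else 0.0

# The probabilities over k = 0, 1, 2, ... sum to 1 (here, up to a tiny tail)
total = sum(pmf(k) for k in range(20))
print(total)
```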

This is the point at which I am unsure of how to go about solving the problem. I have a table of moment generating functions from which I can tell that the moment generating function for any $K_i$ is $M_K(s) = e^{e^s - 1}$, and since all $K_i$ are independent and identically distributed, the moment generating function for $W_n$ is $M_{W_n}(s) = (e^{e^s - 1})^n = e^{n(e^s - 1)}$. However, moment generating functions provide expected values, not probabilities, so I'm not sure whether deriving this was of any use in the first place.

Also, I assume that to find an estimate for $n = 4$ (as an example), one would just substitute 3 for $k_1$ and 5 for $k_2$ in the approximation equation, but this yields $z$-values for $\phi(z)$ beyond the range of the standard normal table (which tabulates $\phi(z)$ only up to $\phi(3.49)$).

Swamp G

1 Answer


Personally I would use $\Phi(x)$ for the cumulative distribution function of a standard normal distribution and $\phi(x)$ for its density function.

You might find it easier if you knew that the sum of independent Poisson random variables is a Poisson random variable with parameter/mean/variance equal to the sum of the original parameters. So $n$ in this case, with $\Pr(W_n=n)=e^{-n}\frac{n^n}{n!}$.
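The exact values $e^{-n}\frac{n^n}{n!}$ for the three requested $n$ can be computed directly; a minimal Python sketch (the helper name is my own):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """Exact P(X = k) for X ~ Poisson(lam)."""
    return exp(-lam) * lam**k / factorial(k)

# P(W_n = n) where W_n ~ Poisson(n), for the three requested n
for n in (4, 25, 64):
    print(n, poisson_pmf(n, n))
```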

You can see this with moment generating functions. The moment generating function of a Poisson distribution with parameter $\lambda$ is $e^{\lambda(e^s-1)}$. So if $\lambda=1$ this is, as you say, $e^{e^s-1}$, and the sum of $n$ i.i.d. examples of this has moment generating function $\left(e^{e^s-1}\right)^n=e^{n(e^s-1)}$, which is the moment generating function of a Poisson distribution with parameter $n$.
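A quick numerical sanity check of this MGF identity (a sketch; `mgf_poisson` is a hypothetical helper name):

```python
from math import exp, isclose

def mgf_poisson(s, lam):
    """MGF of Poisson(lam): E[e^{sX}] = exp(lam * (e^s - 1))."""
    return exp(lam * (exp(s) - 1))

# The MGF of a sum of n i.i.d. Poisson(1) variables, (M_K(s))^n,
# agrees with the MGF of a single Poisson(n) variable at each s checked
n = 5
for s in (-1.0, 0.0, 0.3, 1.0):
    assert isclose(mgf_poisson(s, 1) ** n, mgf_poisson(s, n))
print("MGFs agree")
```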

For example, with $n=64$ you need to compare $e^{-64}\frac{64^{64}}{64!}$ with $\Phi\left(\frac{64.5-64}{\sqrt{64}}\right)-\Phi\left(\frac{63.5-64}{\sqrt{64}}\right)$. They are both about $0.0498$. $\frac{1}{\sqrt{64}}\phi(0)$ is also close, and the central limit theorem leads to $e^{-n}\frac{n^n}{n!} \approx \frac{1}{\sqrt{2\pi n}}$ for large $n$, giving an alternative approach to Stirling's approximation to the factorial.
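Putting the exact value and the continuity-corrected normal approximation side by side for all three $n$ (a Python sketch using `math.erf` for $\Phi$; names are my own):

```python
from math import erf, exp, factorial, sqrt

def Phi(x):
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

for n in (4, 25, 64):
    exact = exp(-n) * n**n / factorial(n)
    # continuity-corrected normal approximation: mean n, std dev sqrt(n)
    approx = Phi((n + 0.5 - n) / sqrt(n)) - Phi((n - 0.5 - n) / sqrt(n))
    print(n, round(exact, 4), round(approx, 4))
```

The approximation is noticeably rough at $n=4$ and very close by $n=64$, as the central limit theorem suggests.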

Henry
  • What is the difference between $\phi(x)$ and $\Phi(x)$? In my probability class, we have only covered one of these (and I believe it is $\phi(x)$), in which $x = \frac{k - E[K]}{\sigma_K}$. We then look up this value of $x$ in the standard normal table, which provides us with $\phi(x)$. – Swamp G Aug 10 '14 at 20:44
  • If your tables look like this, then the shaded area and the numbers are $\Phi(x)$ (the probability of being less than or equal to $x$) while the height is $\phi(x)$ (the density) illustrated here. You can say $\displaystyle \Phi(x)=\int_{-\infty}^x \phi(t)\,dt$. – Henry Aug 10 '14 at 20:52
  • Another question: since the Poisson random variable is a discrete one, do $k_1$ and $k_2$ need to be integers in the approximation? – Swamp G Aug 10 '14 at 21:15
  • Yes, but then you need to add or subtract $0.5$ in the normal approximation as a continuity correction. – Henry Aug 10 '14 at 21:17