4

I was going through a stats book yesterday when I saw the normalisation of the binomial distribution stated. I was wondering if anyone had a proof of that please? $$ \sum^{n}_{r=0}p^{r}(1-p)^{(n-r)}\cdot{^{n}C_{r}}=1. $$

I can see how this simplifies to: $$ n!(1-p)^n\sum_{r=0}^{n}\frac{p^r}{(1-p)^r}\frac{1}{r!(n-r)!}. $$

I've also seen proofs for $$ \sum^{n}_{r=0}{^{n}C_{r}}=2^{n}, $$ but not one for the whole thing. Can anyone help please?

James
  • 163

4 Answers

3

This is simply the binomial theorem:

$$1 = (p+(1-p))^n = \sum_{k=0}^n \binom{n}{k} p^k (1-p)^{n-k},$$

which you can prove by induction.
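For completeness, here is a sketch of how the induction goes for the general identity $(x+y)^n=\sum_{k=0}^n\binom{n}{k}x^ky^{n-k}$, using Pascal's rule $\binom{n+1}{k}=\binom{n}{k}+\binom{n}{k-1}$. The base case $n=0$ reads $1=1$, and for the inductive step,

$$(x+y)^{n+1}=(x+y)\sum_{k=0}^{n}\binom{n}{k}x^{k}y^{n-k}=\sum_{k=0}^{n}\binom{n}{k}x^{k+1}y^{n-k}+\sum_{k=0}^{n}\binom{n}{k}x^{k}y^{n+1-k};$$

shifting the index in the first sum ($k\to k-1$) and combining the two sums term by term with Pascal's rule gives $\sum_{k=0}^{n+1}\binom{n+1}{k}x^{k}y^{n+1-k}$, as required.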

Thomas Ahle
  • 4,612
2

To see this, just recall the binomial theorem: for $n\in\mathbb{N}_0$, $$(x+y)^n=\sum\limits_{k=0}^n{n\choose k}x^ky^{n-k}.$$ So here let $x=p$ and $y=1-p$, and your expression simplifies to $$\sum\limits_{r=0}^np^r(1-p)^{n-r}{n\choose r}=(p+(1-p))^n=1^n=1.$$ Hope this helps.

2

In a binomial distribution, the probabilities of interest are those of obtaining a certain number of successes, $r$, in $n$ independent trials, each of which has only two possible outcomes and the same probability of success, $p$.

Basically, a two-part process is involved. First, we determine the probability of one particular way the event can occur, and then we count the number of different ways the event can occur.

$$P(\text{event}) = (\text{number of ways the event can occur}) \times P(\text{one such occurrence}).$$

Suppose, for example, we want to find the probability of getting $4$ heads in $10$ tosses. In this case, we’ll call getting heads a “success.” Also, in this case, $n = 10$, the number of successes is $r = 4$, and the number of failures (tails) is $n - r = 10 - 4 = 6$. One way this can occur is if the first $4$ tosses are heads and the last $6$ are tails, i.e.

$$S S S S F F F F F F$$

The probability of this particular sequence occurring is:

$$P(S) * P(S) * P(S) * P(S) * P(F) * P(F) * P(F) * P(F) * P(F) * P(F) $$

More generally, if $p$ is the probability of success, then the probability of any one specific sequence of outcomes containing $r$ successes and $n-r$ failures is

$$p^r(1-p)^{n-r}$$
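In the coin example above, with $r=4$ heads and $n-r=6$ tails, each such specific sequence therefore occurs with probability $p^{4}(1-p)^{6}$.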

Of course, this is just one of many ways that you can get $4$ heads; further, because the repeated trials are all independent and identically distributed, each way of getting $4$ heads is equally likely, e.g. the sequence $S S S S F F F F F F$ is just as likely as the sequence $S F S F F S F F S F$.

So, we also need to know how many different sequences produce $4$ heads.

Well, we could just write them all out…but life will be much simpler if we take advantage of two counting rules:


1. The number of different ways that $n$ distinct things may be arranged in order is:

$$ n! = (1)(2)(3)\cdots(n-1)(n). $$

2. The total number of ways of selecting $r$ of the $n$ objects, irrespective of order, is:

$$\binom{n}{r}$$
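One standard way to see where this count comes from, using rule 1: there are $n!$ ordered arrangements of all $n$ objects, and each unordered choice of $r$ of them is counted $r!\,(n-r)!$ times (once for every ordering of the $r$ chosen objects and of the $n-r$ remaining ones), so

$$\binom{n}{r}=\frac{n!}{r!\,(n-r)!}.$$

In the coin example, $\binom{10}{4}=\frac{10!}{4!\,6!}=210$ different sequences of tosses contain exactly $4$ heads.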

We can now write out the complete formula for the binomial distribution:

In sampling from a stationary Bernoulli process with probability of success $p$, the probability of observing exactly $r$ successes in $n$ independent trials is

$$\binom{n}{r} p^r(1-p)^{n-r}$$

Once again, $\binom{n}{r}$ tells you the number of sequences that produce $r$ successes in $n$ trials, while $p^r (1-p)^{n-r}$ tells you the probability of each individual such sequence.
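To finish the coin example: with a fair coin, say ($p=\tfrac12$), the probability of exactly $4$ heads in $10$ tosses is

$$\binom{10}{4}\left(\tfrac12\right)^{4}\left(\tfrac12\right)^{6}=\frac{210}{1024}\approx 0.205,$$

and summing the analogous terms over $r=0,1,\dots,10$ gives exactly $1$, which is the normalisation asked about in the question.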

Hope that helps...

Saket Gurjar
  • 1,663
1

Set $x=\frac{p}{1-p}$ and use $\sum_{k=0}^{n}\binom{n}{k}x^k=(1+x)^n$. Your simplified expression then becomes $$(1-p)^n\sum_{r=0}^{n}\binom{n}{r}x^r=(1-p)^n(1+x)^n=(1-p)^n\left(\tfrac{1}{1-p}\right)^n=1.$$

Alex
  • 19,262