
Binomial Distribution


General description of the binomial distribution


Definition:  The  »binomial distribution«  represents an important special case for the occurrence probabilities of a discrete random variable.

To derive the binomial distribution,  we assume that  $I$  binary and statistically independent random variables  $b_i$  can each take on

  • the value  $1$  with probability  $\Pr(b_i = 1) = p$,  and
  • the value  $0$  with probability  $\Pr(b_i = 0) = 1 - p$.


Then the sum  z  is also a discrete random variable with the symbol set   {0, 1, 2, ..., I},  which is called binomially distributed:

$$z = \sum_{i=1}^{I} b_i.$$

Thus,  the symbol set size is  M=I+1.
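
This definition translates directly into a short simulation.  The following is a minimal Python sketch  (the function name  binomial_sample  is illustrative,  standard library only)  that draws  $z$  as the sum of  $I$  independent binary variables and estimates the probabilities over the symbol set  $\{0, 1, \ldots, I\}$:

```python
import random

def binomial_sample(I, p, rng=random):
    """Draw one value z = b_1 + ... + b_I, where each independent
    binary variable b_i equals 1 with probability p, else 0."""
    return sum(1 if rng.random() < p else 0 for _ in range(I))

# Empirical check: relative frequencies approach the binomial probabilities.
random.seed(1)
I, p, N = 6, 0.4, 100_000
counts = [0] * (I + 1)                  # M = I + 1 possible outcomes
for _ in range(N):
    counts[binomial_sample(I, p)] += 1
for mu, c in enumerate(counts):
    print(f"Pr(z={mu}) ~ {c / N:.4f}")
```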


Example 1:  The binomial distribution finds manifold applications in Communications Engineering as well as in other disciplines:

  1.   It describes the distribution of rejects in statistical quality control.
  2.   It allows the calculation of the residual error probability in blockwise coding.
  3.  The bit error rate of a digital transmission system obtained by simulation is actually a binomially distributed random quantity.

Probabilities of the binomial distribution


Calculation rule:  For the  »probabilities of the binomial distribution«  with  μ=0,..., I:

$$p_\mu = \Pr(z = \mu) = \binom{I}{\mu} \cdot p^\mu \cdot (1 - p)^{I - \mu}.$$

The first term here indicates the number of combinations   (read:  I  over  μ):

$$\binom{I}{\mu} = \frac{I!}{\mu! \cdot (I - \mu)!} = \frac{I \cdot (I-1) \cdot \ldots \cdot (I - \mu + 1)}{1 \cdot 2 \cdot \ldots \cdot \mu}.$$
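
This calculation rule can be written down almost literally in Python.  A minimal sketch,  assuming only the standard library  (the helper name  binomial_pmf  is illustrative;  math.comb  supplies the number of combinations):

```python
from math import comb

def binomial_pmf(mu, I, p):
    """Pr(z = mu) = C(I, mu) * p^mu * (1 - p)^(I - mu)."""
    return comb(I, mu) * p**mu * (1 - p)**(I - mu)

# The probabilities over mu = 0, ..., I form a complete distribution.
I, p = 6, 0.4
probs = [binomial_pmf(mu, I, p) for mu in range(I + 1)]
print(probs)
print(sum(probs))   # -> 1.0 (up to floating-point rounding)
```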


Additional notes:

  1. For very large values of  $I$,  the binomial distribution can be approximated by the  »Poisson distribution«  described in the next section  (see the numerical sketch after this list).
  2. If at the same time the product  $I \cdot p \gg 1$,  then according to  »de Moivre–Laplace's (central limit) theorem«,  the Poisson distribution  (and hence the binomial distribution)  transitions to a discrete  »Gaussian distribution«.
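
The first note can be illustrated numerically.  A small sketch with illustrative parameters  ($I = 1000$,  $p = 0.002$,  hence rate  $\lambda = I \cdot p = 2$)  compares the two distributions:

```python
from math import comb, exp, factorial

def binomial_pmf(mu, I, p):
    return comb(I, mu) * p**mu * (1 - p)**(I - mu)

def poisson_pmf(mu, lam):
    """Poisson probabilities with rate lambda = I * p."""
    return lam**mu * exp(-lam) / factorial(mu)

# For large I and small p the two distributions nearly coincide.
I, p = 1000, 0.002                       # illustrative values, lambda = 2
for mu in range(6):
    print(mu, round(binomial_pmf(mu, I, p), 5),
              round(poisson_pmf(mu, I * p), 5))
```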


[Figure:  Binomial distribution probabilities]

Example 2:  The graph shows the probabilities of the binomial distribution for  $I = 6$  and  $p = 0.4$.

  • Thus  M=I+1=7  probabilities are different from zero.
  • In contrast,  for  I=6  and  p=0.5,  the probabilities of the binomial distribution are as follows:
$$\Pr(z=0) = \Pr(z=6) = 1/64 = 0.015625,$$
$$\Pr(z=1) = \Pr(z=5) = 6/64 = 0.09375,$$
$$\Pr(z=2) = \Pr(z=4) = 15/64 = 0.234375,$$
$$\Pr(z=3) = 20/64 = 0.3125.$$
  • These are symmetrical with respect to the abscissa value  $\mu = I/2 = 3$,  as the short check below confirms.
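
As a quick cross-check of these numbers,  a few lines of Python reproduce the values on the  $1/64$  grid and confirm the symmetry  $p_\mu = p_{I-\mu}$:

```python
from math import comb

I, p = 6, 0.5
probs = [comb(I, mu) * p**mu * (1 - p)**(I - mu) for mu in range(I + 1)]
for mu, pr in enumerate(probs):
    print(f"Pr(z={mu}) = {pr * 64:.0f}/64 = {pr}")
# Symmetry around mu = I/2 = 3 (exact here, since p = 0.5):
assert all(probs[mu] == probs[I - mu] for mu in range(I + 1))
```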


Example 3:  Another example of the application of the binomial distribution is the  »calculation of the block error probability in Digital Signal Transmission«.

If one transmits blocks each of  I=10  binary symbols over a channel

  • with probability  $p = 0.01$  that one symbol is falsified   ⇒   random variable  $e_i = 1$,  and
  • correspondingly with probability  $1 - p = 0.99$  for an unfalsified symbol   ⇒   random variable  $e_i = 0$,


then the new random variable  $f$   ⇒   »number of errors in the block«  is:

$$f = \sum_{i=1}^{I} e_i.$$

This random variable  $f$  can take all integer values between  $0$  (no symbol is falsified)  and  $I$  (all symbols are falsified).

  1. We denote the probabilities for  $\mu$  falsifications by  $p_\mu$.
  2. The case where all  $I$  symbols are correctly transmitted occurs with probability  $p_0 = 0.99^{10} \approx 0.9044$.
  3. This also follows from the binomial formula for  $\mu = 0$,  considering the definition  $\binom{10}{0} = 1$.
  4. A single symbol error  $(f = 1)$  occurs with the probability  $p_1 = 10 \cdot 0.01 \cdot 0.99^{9} \approx 0.0914$.
  5. The first factor considers that there are exactly  $\binom{10}{1} = 10$  possibilities for the position of a single error.
  6. The other two factors take into account that one symbol must be falsified and nine must be transmitted correctly if  $f = 1$  is to hold.


For  $f = 2$  there are clearly more combinations,  namely  $\binom{10}{2} = 45$,  and we get

$$p_2 = 45 \cdot 0.01^2 \cdot 0.99^{8} \approx 0.0041.$$

If a block code can correct up to two errors,  the  »block error probability«  is

$$p_{\rm block} = p_3 + \ldots + p_{10} \approx 10^{-4},$$

or

$$p_{\rm block} = 1 - p_0 - p_1 - p_2 \approx 10^{-4}.$$
  • One can see that for large values of  $I$,  the second way of calculating via the complement reaches the result faster.
  • However,  one could also note that for these numerical values the approximation  $p_{\rm block} \approx p_3$  holds  (both routes are compared in the sketch below).
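
Both calculation routes can be compared directly.  A minimal sketch,  standard library only:

```python
from math import comb

I, p = 10, 0.01
pmf = [comb(I, mu) * p**mu * (1 - p)**(I - mu) for mu in range(I + 1)]

p_block_direct = sum(pmf[3:])               # p_3 + ... + p_10
p_block_complement = 1 - pmf[0] - pmf[1] - pmf[2]
print(p_block_direct, p_block_complement)   # both approx. 1.1e-4
print(pmf[3])                               # p_3 alone already dominates
```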


⇒   Use the interactive HTML5/JS applet  »Binomial and Poisson distribution«  to find the binomial probabilities for any  $I$  and  $p$.


Moments of the binomial distribution


You can calculate the moments in general using the equations given in the earlier chapters.


Calculation rules:  For the  »k-th order moment«  of a binomially distributed random variable,  the general rule is:

$$m_k = {\rm E}\big[z^k\big] = \sum_{\mu = 0}^{I} \mu^k \cdot \binom{I}{\mu} \cdot p^\mu \cdot (1 - p)^{I - \mu}.$$

From this,  we obtain after some transformations  (see the numerical check after this list)

  • the first order moment   ⇒   »linear mean«:
$$m_1 = {\rm E}[z] = I \cdot p,$$
  • the second order moment   ⇒   »power«:
$$m_2 = {\rm E}\big[z^2\big] = (I^2 - I) \cdot p^2 + I \cdot p,$$
  • the variance by applying  »Steiner's theorem«:
$$\sigma^2 = m_2 - m_1^2 = I \cdot p \cdot (1 - p),$$
  • the standard deviation:
$$\sigma = \sqrt{I \cdot p \cdot (1 - p)}.$$
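
These four formulas can be cross-checked against the general moment sum above.  A minimal sketch  (the helper name  moment  is illustrative):

```python
from math import comb, sqrt

def moment(k, I, p):
    """k-th order moment m_k = E[z^k] of the binomial distribution."""
    return sum(mu**k * comb(I, mu) * p**mu * (1 - p)**(I - mu)
               for mu in range(I + 1))

I, p = 6, 0.4
m1, m2 = moment(1, I, p), moment(2, I, p)
print(m1, I * p)                          # linear mean  I * p = 2.4
print(m2, (I**2 - I) * p**2 + I * p)      # power = 7.2
print(m2 - m1**2, I * p * (1 - p))        # variance (Steiner) = 1.44
print(sqrt(m2 - m1**2))                   # standard deviation = 1.2
```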


The maximum variance  $\sigma^2 = I/4$  is obtained for the  »characteristic probability«  $p = 1/2$.

  • In this case,  the binomial probabilities are symmetric around the mean  $m_1 = I/2$:   $p_\mu = p_{I - \mu}$.
  • The more the characteristic probability  $p$  deviates from the value  $1/2$,
  1.   the smaller is the standard deviation  $\sigma$,  and
  2.   the more asymmetric the probabilities become around the mean  $m_1 = I \cdot p$.


Example 4:  

As in  Example 3,  we consider a block of  $I = 10$  binary symbols,  each of which is independently falsified with probability  $p = 0.01$.  The following then holds:

  • The mean number of block errors is equal to  $m_f = {\rm E}[f] = I \cdot p = 0.1$.
  • The standard deviation of the random variable  $f$  is  $\sigma_f = \sqrt{0.1 \cdot 0.99} \approx 0.315$.


In contrast,  the completely disturbed channel   ⇒   bit error probability  $p = 1/2$   results in the values  (recomputed in the sketch after this list):

  • $m_f = 5$   ⇒   on average,  five of the ten bits within a block are wrong,
  • $\sigma_f = \sqrt{I}/2 \approx 1.581$   ⇒   maximum standard deviation.
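
Both channel cases of this example can be recomputed in a few lines  (illustrative sketch using the formulas above):

```python
from math import sqrt

I = 10
for p in (0.01, 0.5):
    mean = I * p                   # mean number of errors per block
    sigma = sqrt(I * p * (1 - p))  # standard deviation
    print(f"p = {p}:  m_f = {mean},  sigma_f = {sigma:.3f}")
# p = 0.01:  m_f = 0.1,  sigma_f = 0.315
# p = 0.5:   m_f = 5.0,  sigma_f = 1.581  (maximum standard deviation)
```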

Exercises for the chapter


Exercise 2.3: Algebraic Sum of Binary Numbers

Exercise 2.4: Number Lottery (6 from 49)