Applets:Binomial and Poisson Distribution (Applet)

From LNTwww
 
{{LntAppletLinkEn|binomPoissonDistributions_en}}         [https://www.lntwww.de/Applets:Binomial-_und_Poissonverteilung_(Applet) '''English Applet with German WIKI description''']
==Applet Description==
 
<br>
 
This applet allows the calculation and graphical display of  
*the probabilities ${\rm Pr}(z=\mu)$ of a discrete random variable $z \in \{\mu \} =  \{0, 1, 2, 3, \text{...} \}$, which determine its ''Probability Density Function'' (PDF) &ndash; shown here with Dirac delta functions ${\rm \delta}( z-\mu)$:
 
:$$f_{z}(z)=\sum_{\mu=1}^{M}{\rm Pr}(z=\mu)\cdot {\rm \delta}( z-\mu),$$
 
*the probabilities ${\rm Pr}(z \le \mu)$ of the ''Cumulative Distribution Function'' (CDF):
:$$F_{z}(\mu)={\rm Pr}(z\le\mu).$$
  
  
  
  
In the exercises below you will be able to compare:  
 
* two Binomial distributions with different sets of parameters $I$ and $p$,  
 
 
* two Poisson distributions with different rates $\lambda$,  
 
  
 
==Theoretical Background==
 
<br>
 
===Properties of the Binomial Distribution===
<br>
 
 
The ''Binomial distribution'' represents an important special case for the occurrence probabilities of a discrete random variable. For the derivation we assume that $I$ binary and statistically independent random variables $b_i \in \{0, 1 \}$  can take
 
 
*the value $1$ with the probability ${\rm Pr}(b_i = 1) = p$, and
 
*the value $0$ with the probability ${\rm Pr}(b_i = 0) = 1-p$.
  
  
The sum
:$$z=\sum_{i=1}^{I}b_i$$
is also a discrete random variable, taking values in the set $\{0, 1, 2, \text{...}\ , I\}$ of size $M = I + 1$, and is called "binomially distributed".
  
  
'''Probabilities of the Binomial Distribution'''
 
The probabilities of the outcome $z = \mu$ for $μ = 0, \text{...}\ , I$ are given by
 
:$$p_\mu = {\rm Pr}(z=\mu)={I \choose \mu}\cdot p^\mu\cdot ({\rm 1}-p)^{I-\mu},$$
 
with the number of combinations $(I \text{ over }\mu)$:
 
:$${I \choose \mu}=\frac{I !}{\mu !\cdot (I-\mu) !}=\frac{ {I\cdot (I- 1) \cdot \ \cdots \ \cdot (I-\mu+ 1)} }{ 1\cdot  2\cdot \ \cdots \ \cdot  \mu}.$$
 
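These probabilities can be checked numerically. The following Python sketch evaluates the formula above (the helper name <code>binomial_pmf</code> is ours, not part of the applet):

```python
from math import comb

def binomial_pmf(mu: int, I: int, p: float) -> float:
    """Pr(z = mu) for the binomial distribution with parameters I and p."""
    return comb(I, mu) * p**mu * (1 - p)**(I - mu)

# All I+1 point probabilities must sum to one
pmf = [binomial_pmf(mu, I=5, p=0.1) for mu in range(6)]
print(sum(pmf))   # 1.0 up to floating-point rounding
```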
  
  
 
  
'''Moments of the Binomial Distribution'''
Consider a binomially distributed random variable $z$ and its moment of order $k$:
 
:$$m_k={\rm E}[z^k]=\sum_{\mu={\rm 0}}^{I}\mu^k\cdot{I \choose \mu}\cdot p^\mu\cdot ({\rm 1}-p)^{I-\mu}.$$
 
  
We can derive the formulas for
 
*the linear average:  &nbsp; $m_1 = I\cdot p,$
 
*the second moment: &nbsp; $m_2 = (I^2-I)\cdot p^2+I\cdot p,$
*the variance and standard deviation: &nbsp; $\sigma^2 = {m_2 - m_1^2} = {I \cdot p\cdot (1-p)} \hspace{0.3cm}\Rightarrow \hspace{0.3cm}
 
\sigma =  \sqrt{I \cdot p\cdot (1-p)}.$
 
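The closed-form moments can be checked against the defining expectation sum; a short Python sketch (helper names are ours):

```python
from math import comb, sqrt

def binomial_pmf(mu, I, p):
    return comb(I, mu) * p**mu * (1 - p)**(I - mu)

def moment(k, I, p):
    """m_k = E[z^k], evaluated directly from the point probabilities."""
    return sum(mu**k * binomial_pmf(mu, I, p) for mu in range(I + 1))

I, p = 5, 0.1
m1 = moment(1, I, p)       # closed form: I*p = 0.5
m2 = moment(2, I, p)       # closed form: (I^2 - I)*p^2 + I*p = 0.7
sigma = sqrt(m2 - m1**2)   # closed form: sqrt(I*p*(1-p))
```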
  
  
'''Applications of the Binomial Distribution'''
 
  
The Binomial distribution has a variety of uses in telecommunications as well as in other disciplines:   
*It characterizes the distribution of rejected parts in statistical quality control.
*The simulated bit error rate of a digital transmission system is technically a binomially distributed random variable.
*The binomial distribution can be used to calculate the residual error probability with blockwise coding, as the following example shows.
  
  
 
{{GraueBox|TEXT=   
 
 
$\text{Example 1:}$&nbsp;
 
When transferring blocks of $I =5$ binary symbols through a channel that
*distorts a symbol with probability $p = 0.1$ &nbsp; &rArr; &nbsp; random variable $e_i = 1$, and
*transfers the symbol undistorted with probability $1 - p = 0.9$  &nbsp; &rArr; &nbsp; random variable $e_i = 0$,
 
   
 
   
  
the new random variable $f$  ("errors per block") is given by:
 
:$$f=\sum_{i=1}^{I}e_i.$$
 
  
$f$ can take integer values between $\mu = 0$ (all symbols are correct) and $\mu = I = 5$ (all five symbols are erroneous). We denote the probability of $\mu$ errors by $p_μ = {\rm Pr}(f = \mu)$.
*The case that all five symbols are transmitted correctly occurs with the probability $p_0 = 0.9^{5} ≈ 0.5905$. This also follows from the binomial formula for $μ = 0$, considering the definition $5\text{ over } 0 = 1$.
*A single error $(f = 1)$ occurs with the probability $p_1 = 5\cdot 0.1\cdot 0.9^4\approx 0.3281$. The first factor indicates that there are $5\text{ over } 1 = 5$ possible error positions. The other two factors take into account that one symbol must be erroneous and the other four correct when $f =1$.
*For $f =2$ there are $5\text{ over } 2 = (5 \cdot 4)/(1 \cdot 2) = 10$ combinations, and one obtains the probability $p_2 = 10\cdot 0.1^2\cdot 0.9^3\approx 0.0729$.
  
  
If a block code can correct up to two errors, the residual error probability is $p_{\rm R} =  1-p_{\rm 0}-p_{\rm 1}-p_{\rm 2}\approx 0.85\%$.  
A second calculation option would be $p_{\rm R} =  p_{3}  + p_{4} + p_{5}$ with the approximation $p_{\rm R} \approx p_{3} = 0.81\%.$
  
The average number of errors in a block is $m_f = 5 \cdot 0.1 = 0.5$ and the variance of the random variable $f$ is $\sigma_f^2 = 5 \cdot 0.1 \cdot 0.9= 0.45$ &nbsp; &rArr; &nbsp;  standard deviation $\sigma_f \approx 0.671.$}}
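The numbers of Example 1 can be reproduced with a few lines of Python (a sketch, not part of the applet):

```python
from math import comb

I, p = 5, 0.1   # block length and symbol error probability from Example 1
pmf = [comb(I, mu) * p**mu * (1 - p)**(I - mu) for mu in range(I + 1)]

# Residual error probability of a block code correcting up to two errors
p_R = 1 - pmf[0] - pmf[1] - pmf[2]
print(round(p_R, 4))   # 0.0086, i.e. about 0.85 %
```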
  
===Properties of the Poisson Distribution===
  
The ''Poisson distribution'' is a limiting case of the Binomial distribution, where

* $I → \infty$ and $p → 0$,

*additionally, the product $λ = I · p$ retains a finite value.
  
The parameter $λ$ indicates the average number of "ones" in a specified time unit and is called ''rate''.
  
Unlike the Binomial distribution, where $0 ≤ μ ≤ I$, here the random variable can assume arbitrarily large non-negative integer values, i.e. the set of possible values is no longer finite. However, since no intermediate values can occur, the Poisson distribution is still a "discrete distribution".
  
 
'''Probabilities of the Poisson Distribution'''
 
With the limits $I → \infty$ and $p → 0$, the occurrence probabilities of the Poisson distributed random variable $z$ follow from the probabilities of the Binomial distribution:
 
:$$p_\mu = {\rm Pr} ( z=\mu ) = \lim_{I\to\infty} \frac{I !}{\mu ! \cdot (I-\mu  )!} \cdot \left(\frac{\lambda}{I}  \right)^\mu \cdot  \left( 1-\frac{\lambda}{I}\right)^{I-\mu}.$$
 
After some algebraic transformations we finally obtain
 
:$$p_\mu = \frac{ \lambda^\mu}{\mu!}\cdot {\rm e}^{-\lambda}.$$
 
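The limit can be illustrated numerically: for growing $I$ with $p = \lambda/I$, the binomial probabilities approach the Poisson value. A Python sketch (helper names are ours):

```python
from math import comb, exp, factorial

lam, mu = 2.0, 3
poisson = lam**mu / factorial(mu) * exp(-lam)   # Pr(z = 3) for lambda = 2

def binomial_pmf(mu, I, p):
    return comb(I, mu) * p**mu * (1 - p)**(I - mu)

# Binomial(I, lambda/I) converges to the Poisson probability as I grows
for I in (10, 100, 10_000):
    print(I, round(binomial_pmf(mu, I, lam / I), 6))
```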
  
'''Moments of the Poisson Distribution'''
  
The moments of the Poisson distribution can be derived directly from the corresponding equations of the Binomial distribution by taking the limits again:
 
:$$m_1 =\lim_{\left.{I\hspace{0.05cm}\to\hspace{0.05cm}\infty, \hspace{0.2cm}  {p\hspace{0.05cm}\to\hspace{0.05cm} 0}}\right.} \hspace{0.2cm} I \cdot p= \lambda,\hspace{0.8cm}
\sigma =\lim_{\left.{I\hspace{0.05cm}\to\hspace{0.05cm}\infty, \hspace{0.2cm}  {p\hspace{0.05cm}\to\hspace{0.05cm} 0}}\right.} \hspace{0.2cm} \sqrt{I \cdot p \cdot (1-p)} = \sqrt {\lambda}.$$
 
  
We can see that for the Poisson distribution $\sigma^2 = m_1 = \lambda$ always holds. In contrast, the moments of the Binomial distribution always fulfill $\sigma^2 < m_1$.
  
[[File: P_ID616__Sto_T_2_4_S2neu.png |frame| Moments of Poisson Distribution]]
 
{{GraueBox|TEXT=   
 
$\text{Example 2:}$&nbsp;
We now compare the Binomial distribution with parameters $I =6$ and $p = 0.4$ with the Poisson distribution with $λ = 2.4$:
*Both distributions have the same linear average $m_1 = 2.4$.  
*The standard deviation of the Poisson distribution (marked red in the figure) is $σ ≈ 1.55$.  
*The standard deviation of the Binomial distribution (marked blue) is $σ = 1.2$.
}}
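The moments quoted in Example 2 follow directly from the formulas above (Python sketch, variable names are ours):

```python
from math import sqrt

I, p = 6, 0.4   # blue: Binomial distribution
lam = 2.4       # red:  Poisson distribution

m1_binomial = I * p                      # 2.4, identical to the Poisson mean lam
sigma_binomial = sqrt(I * p * (1 - p))   # 1.2
sigma_poisson = sqrt(lam)                # about 1.55
```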
 
  
  
'''Applications of the Poisson Distribution'''
  
The Poisson distribution is the result of a so-called ''Poisson point process'' which is often used as a model for a series of events that may occur at random times. Examples of such events are
* failure of devices - an important task in reliability theory,  
* shot noise in optical transmission, and
* the start of telephone calls in a switching center ("teletraffic engineering").
  
  
 
{{GraueBox|TEXT=   
 
$\text{Example 3:}$&nbsp;
A switching center receives on long-term average ninety connection requests per minute $(λ = 1.5 \text{ per second})$. The probabilities $p_µ$ that exactly $\mu$ requests arrive in an arbitrary time interval of one second are:
 
:$$p_\mu = \frac{1.5^\mu}{\mu!}\cdot {\rm e}^{-1.5}.$$
 
  
The resulting numerical values are $p_0 = 0.223$, $p_1 = 0.335$, $p_2 = 0.251$, etc.  
  
From this, additional parameters can be derived:
* The time interval $τ$ between two requests follows the ''exponential distribution'',
* The mean time span between two requests is ${\rm E}[τ] = 1/λ ≈ 0.667 \ \rm s$.
}}
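The probabilities of Example 3 can be reproduced as follows (Python sketch; <code>poisson_pmf</code> is our own helper, not part of the applet):

```python
from math import exp, factorial

lam = 1.5   # average number of requests per second

def poisson_pmf(mu, lam):
    return lam**mu / factorial(mu) * exp(-lam)

print([round(poisson_pmf(mu, lam), 3) for mu in range(3)])   # [0.223, 0.335, 0.251]
mean_gap = 1 / lam   # mean time between two requests: about 0.667 s
```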
  
  
=== Comparison of Binomial and Poisson Distribution ===
This section deals with the similarities and differences between Binomial and Poisson distributions.
  
[[File:  EN_Sto_T_2_4_S3.png |frame| Binomial vs. Poisson distribution]]
<br>
 
 
  
The '''Binomial distribution''' is suitable for describing stochastic events characterized by a fixed clock period $T$. For example, for ISDN  (''Integrated Services Digital Network'') with $64 \ \rm kbit/s$, the clock period is $T \approx 15.6 \ \rm \mu s$.
* Binary events, such as the error-free $(e_i = 0)$ or faulty $(e_i = 1)$ transmission of individual symbols, occur only on this time grid.
* The Binomial distribution allows statistical statements about the number of transmission errors expected in a longer time interval $T_{\rm I} = I · T$, as shown in the upper time diagram (blue marks).
* For very large values of $I$ and very small values of $p$, the Binomial distribution can be approximated by the ''Poisson distribution'' with rate $\lambda = I \cdot p$.  
* If at the same time $I · p \gg 1$, the Poisson distribution as well as the Binomial distribution turn into a discrete Gaussian distribution according to the ''de Moivre-Laplace Theorem''.
 
  
  
The '''Poisson distribution''' can also be used to make statements about the number of occurring binary events in a finite time interval.
 
   
 
   
By assuming the same observation period $T_{\rm I}$ and increasing the number of partial periods $I$, the period $T$, in which a new event ($0$ or $1$) can occur, gets smaller and smaller. In the limit where $T$ goes to zero, this means:  
* With the Poisson distribution, binary events can occur not only at discrete times given by a time grid, but at any time, as illustrated in the lower time diagram.
* In order to obtain, on average, the same number of "ones" in the period $T_{\rm I}$ as in the Binomial distribution (six pulses in the example), the characteristic probability  $p = {\rm Pr}( e_i = 1)$ for the infinitesimally small time interval $T$ must go to zero.
  
==Exercises==
  
* First select the number&nbsp; $(1,\text{...}, 7)$&nbsp; of the exercise.&nbsp; The number&nbsp; $0$&nbsp; corresponds to a "Reset":&nbsp; Same setting as at program start.
*A task description is displayed.&nbsp; The parameter values are adjusted accordingly.&nbsp; The solution is shown after pressing "Show solution". <br>
*In these exercises, the term&nbsp; '''Blue'''&nbsp; refers to distribution function 1 (marked blue in the applet) and the term&nbsp; '''Red'''&nbsp; refers to distribution function 2 (marked red in applet).
  
  
 
{{BlaueBox|TEXT=
 
'''(1)'''&nbsp; Set '''Blue''' to Binomial distribution $(I=5, \ p=0.4)$ and '''Red''' to Binomial distribution $(I=10, \ p=0.2)$.
:What are the probabilities ${\rm Pr}(z=0)$ and ${\rm Pr}(z=1)$?
}}
  
  
$\hspace{1.0cm}\Rightarrow\hspace{0.3cm}\text{Blue: }{\rm Pr}(z=0)=0.6^5=7.78\%, \hspace{0.3cm}{\rm Pr}(z=1)=0.4 \cdot 0.6^4=25.92\%;$
  
$\hspace{1.85cm}\text{Red: }{\rm Pr}(z=0)=0.8^{10}=10.74\%, \hspace{0.3cm}{\rm Pr}(z=1)=0.2 \cdot 0.8^9=26.84\%.$
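These values can be checked with a short Python sketch (the helper name is ours):

```python
from math import comb

def binomial_pmf(mu, I, p):
    return comb(I, mu) * p**mu * (1 - p)**(I - mu)

# Blue: I = 5, p = 0.4;  Red: I = 10, p = 0.2
print(round(binomial_pmf(0, 5, 0.4), 4), round(binomial_pmf(1, 5, 0.4), 4))    # 0.0778 0.2592
print(round(binomial_pmf(0, 10, 0.2), 4), round(binomial_pmf(1, 10, 0.2), 4))  # 0.1074 0.2684
```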
  
 
{{BlaueBox|TEXT=
 
'''(2)'''&nbsp; Using the same settings as in '''(1)''', what are the probabilities ${\rm Pr}(3 \le z \le 5)$?
}}
  
  
$\hspace{1.0cm}\Rightarrow\hspace{0.3cm}\text{Note that }{\rm Pr}(3 \le z \le 5) = {\rm Pr}(z=3) + {\rm Pr}(z=4) + {\rm Pr}(z=5)\text{, or }
 
{\rm Pr}(3 \le z \le 5) = {\rm Pr}(z \le 5) - {\rm Pr}(z \le 2)$
 
  
$\hspace{1.85cm}\text{Blue: }{\rm Pr}(3 \le z \le 5) = 0.2304+ 0.0768 + 0.0102 =1 - 0.6826 = 0.3174;$
  
$\hspace{1.85cm}\text{Red: }{\rm Pr}(3 \le z \le 5) = 0.2013 + 0.0881 + 0.0264 = 0.9936 - 0.6778 = 0.3158.$
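Both calculation paths can be verified numerically (Python sketch, helper names are ours):

```python
from math import comb

def binomial_pmf(mu, I, p):
    return comb(I, mu) * p**mu * (1 - p)**(I - mu)

def prob_range(lo, hi, I, p):
    """Pr(lo <= z <= hi) as a sum of point probabilities."""
    return sum(binomial_pmf(mu, I, p) for mu in range(lo, hi + 1))

print(round(prob_range(3, 5, 5, 0.4), 4))    # Blue: 0.3174
print(round(prob_range(3, 5, 10, 0.2), 4))   # Red:  0.3158
```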
  
 
{{BlaueBox|TEXT=
 
'''(3)'''&nbsp; Using the same settings as in '''(1)''', what are the differences in the linear average $m_1$ and the standard deviation $\sigma$ between the two Binomial distributions?
}}
  
  
$\hspace{1.0cm}\Rightarrow\hspace{0.3cm}\text{Average:}\hspace{0.2cm}m_\text{1} = I \cdot p\hspace{0.3cm} \Rightarrow\hspace{0.3cm} m_\text{1, Blue}  = 5 \cdot 0.4\underline{ = 2 =}  \ m_\text{1, Red} = 10 \cdot 0.2; $
  
$\hspace{1.85cm}\text{Standard deviation:}\hspace{0.4cm}\sigma = \sqrt{I \cdot p \cdot (1-p)} = \sqrt{m_1 \cdot (1-p)}\hspace{0.3cm}\Rightarrow\hspace{0.3cm} \sigma_{\rm Blue} = \sqrt{2 \cdot 0.6} =1.095 < \sigma_{\rm Red} = \sqrt{2 \cdot 0.8} = 1.265.$
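A quick numerical check of both parameter sets (Python sketch, helper name is ours):

```python
from math import sqrt

def binomial_stats(I, p):
    """Mean and standard deviation of a binomial distribution."""
    m1 = I * p
    return m1, sqrt(m1 * (1 - p))

print(binomial_stats(5, 0.4))    # Blue: mean 2.0, sigma about 1.095
print(binomial_stats(10, 0.2))   # Red:  mean 2.0, sigma about 1.265
```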
  
 
{{BlaueBox|TEXT=
 
'''(4)'''&nbsp; Set '''Blue''' to Binomial distribution $(I=15, p=0.3)$ and '''Red''' to Poisson distribution $(\lambda=4.5)$.
:What differences arise between both distributions regarding the average $m_1$ and variance $\sigma^2$?
}}
  
  
$\hspace{1.0cm}\Rightarrow\hspace{0.3cm}\text{Both distributions have the same average:}\hspace{0.2cm}m_\text{1, Blue}  =  I \cdot p\ = 15 \cdot 0.3\hspace{0.15cm}\underline{ = 4.5 =} \  m_\text{1, Red} = \lambda$;
  
$\hspace{1.85cm} \text{Binomial distribution: }\hspace{0.2cm} \sigma_\text{Blue}^2 = m_\text{1, Blue} \cdot (1-p)\hspace{0.15cm}\underline { = 3.15} < \text{Poisson distribution: }\hspace{0.2cm} \sigma_\text{Red}^2 = \lambda\hspace{0.15cm}\underline { = 4.5}$;
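The comparison of the two variances in one short sketch (variable names are ours):

```python
I, p = 15, 0.3   # blue: Binomial distribution
lam = 4.5        # red:  Poisson distribution

m1_binomial = I * p             # 4.5, equal to the Poisson mean lam
var_binomial = I * p * (1 - p)  # 3.15
var_poisson = lam               # 4.5: for Poisson, variance always equals the mean
```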
  
 
{{BlaueBox|TEXT=
 
'''(5)'''&nbsp; Using the same settings as in '''(4)''', what are the probabilities ${\rm Pr}(z  \gt 10)$ and ${\rm Pr}(z \gt 15)$?
}}
 
  
$\hspace{1.0cm}\Rightarrow\hspace{0.3cm} \text{Binomial: }\hspace{0.2cm} {\rm Pr}(z  \gt 10) = 1 - {\rm Pr}(z  \le 10) = 1 - 0.9993 = 0.0007;\hspace{0.3cm} {\rm Pr}(z \gt 15) = 0 \ {\rm (exactly)}$.
  
$\hspace{1.85cm}\text{Poisson: }\hspace{0.2cm} {\rm Pr}(z  \gt 10) = 1 - 0.9933 = 0.0067;\hspace{0.3cm}{\rm Pr}(z \gt 15) \gt 0\hspace{0.2cm}( \approx 0)$;
  
$\hspace{1.85cm}\text{Approximation: }\hspace{0.2cm}{\rm Pr}(z \gt 15) \ge {\rm Pr}(z = 16) = \lambda^{16} /{16!}\cdot {\rm e}^{-\lambda}\approx 1.5 \cdot 10^{-5}$
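The tail probabilities can be evaluated exactly with a short Python sketch; the helper names below are ours:

```python
from math import comb, exp, factorial

def binomial_cdf(k, I, p):
    return sum(comb(I, mu) * p**mu * (1 - p)**(I - mu) for mu in range(k + 1))

def poisson_cdf(k, lam):
    return sum(lam**mu / factorial(mu) * exp(-lam) for mu in range(k + 1))

I, p, lam = 15, 0.3, 4.5
print(round(1 - binomial_cdf(10, I, p), 4))   # 0.0007
print(round(1 - poisson_cdf(10, lam), 4))     # 0.0067
print(1 - binomial_cdf(15, I, p))             # about 0 (exactly zero: z cannot exceed I)
print(1 - poisson_cdf(15, lam))               # tiny, but strictly positive
```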
  
 
{{BlaueBox|TEXT=
 
'''(6)'''&nbsp; Using the same settings as in '''(4)''', which parameters lead to a symmetric distribution around $m_1$?
}}
$\hspace{1.0cm}\Rightarrow\hspace{0.3cm} \text{Binomial distribution with }p = 0.5\text{:  }p_\mu =  {\rm Pr}(z  = \mu)\text{ symmetric around } m_1 = I/2 = 7.5 \ ⇒  \ p_μ = p_{I–μ}\ ⇒  \  p_8 = p_7, \ p_9 = p_6,  \text{etc.}$
$\hspace{1.85cm}\text{In contrast, the Poisson distribution is never symmetric, since it extends to infinity!}$
==Applet Manual==
[[File:Handhabung_binomial.png|left|600px]]
&nbsp; &nbsp; '''(A)''' &nbsp; &nbsp; Preselection for blue parameter set

&nbsp; &nbsp; '''(B)''' &nbsp; &nbsp; Parameter input: Sliders $I$ and $p$

&nbsp; &nbsp; '''(C)''' &nbsp; &nbsp; Preselection for red parameter set

&nbsp; &nbsp; '''(D)''' &nbsp; &nbsp; Parameter input: Slider $\lambda$

&nbsp; &nbsp; '''(E)''' &nbsp; &nbsp; Graphic display of the distribution

&nbsp; &nbsp; '''(F)''' &nbsp; &nbsp; Output of moments for blue parameter set

&nbsp; &nbsp; '''(G)''' &nbsp; &nbsp; Output of moments for red parameter set

&nbsp; &nbsp; '''(H)''' &nbsp; &nbsp; Variation possibilities for the graphic display

$\hspace{1.5cm}$ "$+$" (Zoom in),

$\hspace{1.5cm}$ "$-$" (Zoom out)

$\hspace{1.5cm}$ "$\rm o$" (Reset)
  
$\hspace{1.5cm}$ "$\leftarrow$" (Move left),  etc.
  
&nbsp; &nbsp; '''( I )''' &nbsp; &nbsp; Output of ${\rm Pr} (z = \mu)$ and ${\rm Pr} (z  \le \mu)$  
  
&nbsp; &nbsp; '''(J)''' &nbsp; &nbsp; Exercises: Exercise selection, description and solution
<br clear=all>
<br>'''Other options for graphic display''':
*Hold shift and scroll: Zoom in on/out of coordinate system,
*Hold shift and left click: Move the coordinate system.
  
==About the Authors==
This interactive calculation tool was designed and realized at the&nbsp; [http://www.lnt.ei.tum.de/startseite Lehrstuhl für Nachrichtentechnik]&nbsp; of the&nbsp; [https://www.tum.de/ Technische Universität München].
*The original version was created in 2003 by&nbsp; [[Biographies_and_Bibliographies/An_LNTwww_beteiligte_Studierende#Ji_Li_.28Bachelorarbeit_EI_2003.2C_Diplomarbeit_EI_2005.29|Ji Li]] as part of her Diploma thesis using "FlashMX&ndash;Actionscript"&nbsp; (Supervisor:&nbsp; [[Biographies_and_Bibliographies/An_LNTwww_beteiligte_Mitarbeiter_und_Dozenten#Prof._Dr.-Ing._habil._G.C3.BCnter_S.C3.B6der_.28am_LNT_seit_1974.29|Günter Söder]]).  
*2018 wurde dieses Programm von [[Biografien_und_Bibliografien/An_LNTwww_beteiligte_Studierende#Jimmy_He_.28Bachelorarbeit_2018.29|Jimmy He]] im Rahmen seiner Bachelorarbeit (Betreuer: [[Biografien_und_Bibliografien/Beteiligte_der_Professur_Leitungsgebundene_%C3%9Cbertragungstechnik#Tasn.C3.A1d_Kernetzky.2C_M.Sc._.28bei_L.C3.9CT_seit_2014.29|Tasnád Kernetzky]] &ndash; Mitarbeiter der Professur &bdquo;Leitungsgebundene Übertragungstechnik&bdquo;) auf  &bdquo;HTML5&rdquo; umgesetzt und neu gestaltet.
+
*In 2018 this Applet was redesigned and updated to &quot;HTML5&quot; by&nbsp; [[Biographies_and_Bibliographies/An_LNTwww_beteiligte_Studierende#Jimmy_He_.28Bachelorarbeit_2018.29|Jimmy He]]&nbsp; as part of his Bachelor's thesis (Supervisor:&nbsp; [[Biographies_and_Bibliographies/Beteiligte_der_Professur_Leitungsgebundene_%C3%9Cbertragungstechnik#Tasn.C3.A1d_Kernetzky.2C_M.Sc._.28bei_L.C3.9CT_seit_2014.29|Tasnád Kernetzky]]) .
  
==Nochmalige Aufrufmöglichkeit des Applets in neuem Fenster==
+
==Once again: Open Applet in new Tab==
  
{{LntAppletLink|verzerrungen}}
+
{{LntAppletLinkEn|binomPoissonDistributions_en}} &nbsp; &nbsp; &nbsp; &nbsp; [https://www.lntwww.de/Applets:Binomial-_und_Poissonverteilung_(Applet) '''English Applet with German WIKI description''']

Latest revision as of 21:40, 26 March 2023

Open Applet in new Tab         English Applet with German WIKI description


Applet Description

This applet allows the calculation and graphical display of

  • the probabilities ${\rm Pr}(z=\mu)$ of a discrete random variable $z \in \{\mu \} = \{0, 1, 2, 3, \text{...} \}$, which determine its Probability Density Function (PDF), represented here with Dirac delta functions ${\rm \delta}( z-\mu)$:
$$f_{z}(z)=\sum_{\mu=0}^{M-1}{\rm Pr}(z=\mu)\cdot {\rm \delta}( z-\mu),$$
  • the probabilities ${\rm Pr}(z \le \mu)$ of the Cumulative Distribution Function (CDF):
$$F_{z}(\mu)={\rm Pr}(z\le\mu).$$


Two discrete distributions with adjustable parameters are available:

  • the Binomial distribution with the parameters $I$ and $p$   ⇒   $z \in \{0, 1, \text{...} \ , I \}$   ⇒   $M = I+1$ possible values,
  • the Poisson distribution with the parameter $\lambda$   ⇒   $z \in \{0, 1, 2, 3, \text{...}\}$   ⇒   $M \to \infty$.


In the exercises below you will be able to compare:

  • two Binomial distributions with different sets of parameters $I$ and $p$,
  • two Poisson distributions with different rates $\lambda$,
  • a Binomial distribution with a Poisson distribution.

Theoretical Background

Properties of the Binomial Distribution

The Binomial distribution describes the occurrence probabilities of an important class of discrete random variables. For its derivation we assume that $I$ binary and statistically independent random variables $b_i \in \{0, 1 \}$ can take

  • the value $1$ with the probability ${\rm Pr}(b_i = 1) = p$, and
  • the value $0$ with the probability ${\rm Pr}(b_i = 0) = 1-p$.


The sum

$$z=\sum_{i=1}^{I}b_i$$

is then also a discrete random variable, with values from the set $\{0, 1, 2, \cdots\ , I\}$ of size $M = I + 1$; it is called "binomially distributed".


Probabilities of the Binomial Distribution

The probabilities that $z = \mu$ holds, for $μ = 0, \text{...}\ , I$, are given by

$$p_\mu = {\rm Pr}(z=\mu)={I \choose \mu}\cdot p^\mu\cdot ({\rm 1}-p)^{I-\mu},$$

with the binomial coefficient ("$I$ choose $\mu$"):

$${I \choose \mu}=\frac{I !}{\mu !\cdot (I-\mu) !}=\frac{ {I\cdot (I- 1) \cdot \ \cdots \ \cdot (I-\mu+ 1)} }{ 1\cdot 2\cdot \ \cdots \ \cdot \mu}.$$
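
As a quick numerical cross-check of the two formulas above, here is a minimal Python sketch (the applet itself is implemented in HTML5; the helper name binomial_pmf is our own choice):

```python
from math import comb  # comb(I, mu) is the binomial coefficient "I choose mu"

def binomial_pmf(I, p):
    """Probabilities Pr(z = mu) for mu = 0, ..., I."""
    return [comb(I, mu) * p**mu * (1 - p)**(I - mu) for mu in range(I + 1)]

probs = binomial_pmf(5, 0.4)   # e.g. I = 5, p = 0.4 as in exercise (1) below
print(probs[0])                # Pr(z = 0) = 0.6^5, approximately 0.0778
print(sum(probs))              # all I + 1 probabilities add up to 1
```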


Moments of the Binomial Distribution

Consider a binomially distributed random variable $z$ and its moment of order $k$ (the expected value of $z^k$):

$$m_k={\rm E}[z^k]=\sum_{\mu={\rm 0}}^{I}\mu^k\cdot{I \choose \mu}\cdot p^\mu\cdot ({\rm 1}-p)^{I-\mu}.$$

We can derive the formulas for

  • the linear average:   $m_1 = I\cdot p,$
  • the second moment:   $m_2 = (I^2-I)\cdot p^2+I\cdot p,$
  • the variance and standard deviation:   $\sigma^2 = {m_2 - m_1^2} = {I \cdot p\cdot (1-p)} \hspace{0.3cm}\Rightarrow \hspace{0.3cm} \sigma = \sqrt{I \cdot p\cdot (1-p)}.$
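
These three formulas can be verified numerically; a short Python sketch with example values $I = 6$ and $p = 0.4$ (the helper names are ours):

```python
from math import comb, sqrt

def binomial_pmf(I, p):
    # Pr(z = mu) for mu = 0, ..., I
    return [comb(I, mu) * p**mu * (1 - p)**(I - mu) for mu in range(I + 1)]

def moment(probs, k):
    # k-th order moment m_k = E[z^k] of a distribution on 0, ..., len(probs)-1
    return sum(mu**k * p_mu for mu, p_mu in enumerate(probs))

I, p = 6, 0.4
probs = binomial_pmf(I, p)
m1 = moment(probs, 1)       # matches I * p = 2.4
m2 = moment(probs, 2)       # matches (I^2 - I) * p^2 + I * p = 7.2
sigma = sqrt(m2 - m1**2)    # matches sqrt(I * p * (1 - p)) = 1.2
```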


Applications of the Binomial Distribution

The Binomial distribution has a variety of uses in telecommunications as well as in other disciplines:

  • It characterizes the distribution of rejected parts (rejects) in statistical quality control.
  • The simulated bit error rate of a digital transmission system is technically a binomially distributed random variable.
  • The binomial distribution can be used to calculate the residual error probability with blockwise coding, as the following example shows.


$\text{Example 1:}$  When transmitting blocks of $I =5$ binary symbols over a channel that

  • distorts a symbol with probability $p = 0.1$   ⇒   random variable $e_i = 1$, and
  • transfers the symbol undistorted with probability $1 - p = 0.9$   ⇒   random variable $e_i = 0$,


the new random variable $f$ ("errors per block") is given by:

$$f=\sum_{i=1}^{I}e_i.$$

$f$ can now take integer values between $\mu = 0$ (all symbols are correct) and $\mu = I = 5$ (all five symbols are erroneous). We describe the probability of $\mu$ errors as $p_μ = {\rm Pr}(f = \mu)$.

  • The case that all five symbols are transmitted correctly occurs with the probability $p_0 = 0.9^{5} ≈ 0.5905$. This also follows from the binomial formula for $μ = 0$, considering the definition ${5 \choose 0} = 1$.
  • A single error $(f = 1)$ occurs with the probability $p_1 = 5\cdot 0.1\cdot 0.9^4\approx 0.3281$. The first factor indicates that there are ${5 \choose 1} = 5$ possible error positions. The other two factors account for the fact that one symbol is erroneous and the other four are correct when $f =1$.
  • For $f =2$ there are ${5 \choose 2} = (5 \cdot 4)/(1 \cdot 2) = 10$ combinations, which yields the probability $p_2 = 10\cdot 0.1^2\cdot 0.9^3\approx 0.0729$.


If a block code can correct up to two errors, the residual error probability is $p_{\rm R} = 1-p_{\rm 0}-p_{\rm 1}-p_{\rm 2}\approx 0.85\%$. A second calculation option would be $p_{\rm R} = p_{3} + p_{4} + p_{5}$ with the approximation $p_{\rm R} \approx p_{3} = 0.81\%.$

The average number of errors in a block is $m_f = 5 \cdot 0.1 = 0.5$ and the variance of the random variable $f$ is $\sigma_f^2 = 5 \cdot 0.1 \cdot 0.9= 0.45$   ⇒   standard deviation $\sigma_f \approx 0.671.$
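
The numbers of Example 1 can be reproduced with a few lines of Python (the variable names are our own):

```python
from math import comb

I, p = 5, 0.1                  # block length and symbol error probability
# p_mu[mu] = Pr(f = mu): probability of exactly mu errors per block
p_mu = [comb(I, mu) * p**mu * (1 - p)**(I - mu) for mu in range(I + 1)]

p_residual = 1 - p_mu[0] - p_mu[1] - p_mu[2]   # code corrects up to two errors
m_f = I * p                                    # mean number of errors: 0.5
var_f = I * p * (1 - p)                        # variance: 0.45
```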


Properties of the Poisson Distribution

The Poisson distribution is a limiting case of the Binomial distribution, where

  • the limits $I → \infty$ and $p →0$ are taken,
  • while the product $λ = I · p$ remains finite.


The parameter $λ$ indicates the average number of "ones" in a specified time unit and is called the rate.

Unlike the Binomial distribution, where $0 ≤ μ ≤ I$, the random variable here can assume arbitrarily large non-negative integer values, i.e. the set of possible values is infinite. However, since no intermediate values can occur, the Poisson distribution is still a "discrete distribution".


Probabilities of the Poisson Distribution

With the limits $I → \infty$ and $p →0$, the occurrence probabilities of the Poisson distributed random variable $z$ can be derived from the probabilities of the Binomial distribution:

$$p_\mu = {\rm Pr} ( z=\mu ) = \lim_{I\to\infty} \frac{I !}{\mu ! \cdot (I-\mu )!} \cdot \left(\frac{\lambda}{I} \right)^\mu \cdot \left( 1-\frac{\lambda}{I}\right)^{I-\mu}.$$

After some algebraic transformations we finally obtain

$$p_\mu = \frac{ \lambda^\mu}{\mu!}\cdot {\rm e}^{-\lambda}.$$

Moments of the Poisson Distribution

The moments of the Poisson distribution can be derived directly from the corresponding equations of the Binomial distribution by taking the limits again:

$$m_1 =\lim_{\left.{I\hspace{0.05cm}\to\hspace{0.05cm}\infty, \hspace{0.2cm} {p\hspace{0.05cm}\to\hspace{0.05cm} 0}}\right.} \hspace{0.2cm} I \cdot p= \lambda,\hspace{0.8cm} \sigma =\lim_{\left.{I\hspace{0.05cm}\to\hspace{0.05cm}\infty, \hspace{0.2cm} {p\hspace{0.05cm}\to\hspace{0.05cm} 0}}\right.} \hspace{0.2cm} \sqrt{I \cdot p \cdot (1-p)} = \sqrt {\lambda}.$$

We can see that for the Poisson distribution $\sigma^2 = m_1 = \lambda$ always holds. In contrast, the moments of the Binomial distribution always fulfill $\sigma^2 < m_1$.
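
The identity $\sigma^2 = m_1 = \lambda$ can be checked numerically. The following Python sketch truncates the infinite sum at $\mu = 100$, which is a more than sufficient approximation for $\lambda = 2.4$:

```python
from math import exp, factorial

def poisson_pmf(lam, mu):
    # Pr(z = mu) for a Poisson random variable with rate lam
    return lam**mu / factorial(mu) * exp(-lam)

lam = 2.4
probs = [poisson_pmf(lam, mu) for mu in range(101)]   # truncated at mu = 100
m1 = sum(mu * p for mu, p in enumerate(probs))        # -> lam
m2 = sum(mu**2 * p for mu, p in enumerate(probs))
variance = m2 - m1**2                                 # -> lam as well
```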

[Figure: Moments of the Poisson distribution]

$\text{Example 2:}$  We now compare the Binomial distribution with the parameters $I =6$ and $p = 0.4$ with the Poisson distribution with $λ = 2.4$:

  • Both distributions have the same linear average $m_1 = 2.4$.
  • The standard deviation of the Poisson distribution (marked red in the figure) is $σ ≈ 1.55$.
  • The standard deviation of the Binomial distribution (marked blue) is $σ = 1.2$.


Applications of the Poisson Distribution

The Poisson distribution is the result of a so-called Poisson point process which is often used as a model for a series of events that may occur at random times. Examples of such events are

  • failure of devices - an important task in reliability theory,
  • shot noise in optical transmission systems, and
  • the start of conversations in a telephone switching center ("teletraffic engineering").


$\text{Example 3:}$  A telephone switching center receives ninety requests per minute on average $(λ = 1.5 \text{ per second})$. The probabilities $p_µ$ that exactly $\mu$ requests arrive in an arbitrary time frame of one second are:

$$p_\mu = \frac{1.5^\mu}{\mu!}\cdot {\rm e}^{-1.5}.$$

The resulting numerical values are $p_0 = 0.223$, $p_1 = 0.335$, $p_2 = 0.251$, etc.

From this, additional parameters can be derived:

  • The interval $τ$ between two requests follows an "exponential distribution";
  • The mean time span between two requests is ${\rm E}[τ] = 1/λ ≈ 0.667 \ \rm s$.
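
Both statements of Example 3 can be illustrated in Python; random.expovariate draws exponentially distributed inter-arrival times (the simulation size and seed are our own choices):

```python
import random
from math import exp, factorial

lam = 1.5                                   # average requests per second
# Poisson probabilities p_0, p_1, p_2 from the formula above
p = [lam**mu / factorial(mu) * exp(-lam) for mu in range(3)]
# p[0], p[1], p[2] are approximately 0.223, 0.335, 0.251

# For a Poisson process the inter-arrival times are exponentially
# distributed with mean E[tau] = 1/lam, approximately 0.667 s
random.seed(1)
gaps = [random.expovariate(lam) for _ in range(100_000)]
mean_gap = sum(gaps) / len(gaps)            # close to 1/1.5
```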


Comparison of Binomial and Poisson Distribution

This section deals with the similarities and differences between Binomial and Poisson distributions.

[Figure: Binomial vs. Poisson distribution (time charts)]

The Binomial distribution is used to describe stochastic events that occur within a fixed time grid of duration $T$. For example, the symbol duration of an ISDN (Integrated Services Digital Network) connection at $64 \ \rm kbit/s$ is $T \approx 15.6 \ \rm \mu s$.

  • Binary events such as the error-free $(e_i = 0)$/ faulty $(e_i = 1)$ transmission of individual symbols only occur in this time frame.
  • With the Binomial distribution, it is possible to make statistical statements about the number of expected errors in a period $T_{\rm I} = I · T$, as shown in the time chart above (marked blue).
  • For very large values of $I$ and very small values of $p$, the Binomial distribution can be approximated by the Poisson distribution with rate $\lambda = I \cdot p$.
  • If at the same time $I · p \gg 1$, the Poisson distribution as well as the Binomial distribution turn into a discrete Gaussian distribution according to the de Moivre-Laplace Theorem.
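
The approximation of the Binomial distribution by the Poisson distribution can also be observed numerically. In this sketch, $\lambda = I \cdot p$ is kept fixed while $I$ grows (the chosen values are illustrative):

```python
from math import comb, exp, factorial

def binomial_pmf(I, p, mu):
    return comb(I, mu) * p**mu * (1 - p)**(I - mu)

def poisson_pmf(lam, mu):
    return lam**mu / factorial(mu) * exp(-lam)

lam, mu = 4.5, 3
# Keep lam = I * p constant while I grows: the binomial probability
# converges to the Poisson probability
errors = [abs(binomial_pmf(I, lam / I, mu) - poisson_pmf(lam, mu))
          for I in (15, 150, 1500)]
# the deviations shrink as I grows
```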


The Poisson distribution can also be used to make statements about the number of occurring binary events in a finite time interval.

By assuming the same observation period $T_{\rm I}$ and increasing the number of partial periods $I$, the period $T$, in which a new event ($0$ or $1$) can occur, gets smaller and smaller. In the limit where $T$ goes to zero, this means:

  • With the Poisson distribution binary events can not only occur at certain given times, but at any time, which is illustrated in the second time chart.
  • In order to obtain, on average, the same number of "ones" in the period $T_{\rm I}$ as in the Binomial distribution (six pulses in the example), the characteristic probability $p = {\rm Pr}( e_i = 1)$ for an infinitesimally small time interval $T$ must tend to zero.


Exercises

  • First select the number  $(1,\text{...}, 7)$  of the exercise.  The number  $0$  corresponds to a "Reset":  Same setting as at program start.
  • A task description is displayed. The parameter values are already adjusted. Solution after pressing "Show solution".
  • In these exercises, the term Blue refers to distribution function 1 (marked blue in the applet) and the term Red refers to distribution function 2 (marked red in the applet).


(1)  Set Blue to Binomial distribution $(I=5, \ p=0.4)$ and Red to Binomial distribution $(I=10, \ p=0.2)$.

What are the probabilities ${\rm Pr}(z=0)$ and ${\rm Pr}(z=1)$?


$\hspace{1.0cm}\Rightarrow\hspace{0.3cm}\text{Blue: }{\rm Pr}(z=0)=0.6^5=7.78\%, \hspace{0.3cm}{\rm Pr}(z=1)=5 \cdot 0.4 \cdot 0.6^4=25.92\%;$

$\hspace{1.85cm}\text{Red: }{\rm Pr}(z=0)=0.8^{10}=10.74\%, \hspace{0.3cm}{\rm Pr}(z=1)=10 \cdot 0.2 \cdot 0.8^9=26.84\%.$

(2)  Using the same settings as in (1), what are the probabilities ${\rm Pr}(3 \le z \le 5)$?


$\hspace{1.0cm}\Rightarrow\hspace{0.3cm}\text{Note that }{\rm Pr}(3 \le z \le 5) = {\rm Pr}(z=3) + {\rm Pr}(z=4) + {\rm Pr}(z=5)\text{, or } {\rm Pr}(3 \le z \le 5) = {\rm Pr}(z \le 5) - {\rm Pr}(z \le 2)$

$\hspace{1.85cm}\text{Blue: }{\rm Pr}(3 \le z \le 5) = 0.2304+ 0.0768 + 0.0102 =1 - 0.6826 = 0.3174;$

$\hspace{1.85cm}\text{Red: }{\rm Pr}(3 \le z \le 5) = 0.2013 + 0.0881 + 0.0264 = 0.9936 - 0.6778 = 0.3158.$

(3)  Using the same settings as in (1), what are the differences in the linear average $m_1$ and the standard deviation $\sigma$ between the two Binomial distributions?


$\hspace{1.0cm}\Rightarrow\hspace{0.3cm}\text{Average:}\hspace{0.2cm}m_\text{1} = I \cdot p\hspace{0.3cm} \Rightarrow\hspace{0.3cm} m_\text{1, Blue} = 5 \cdot 0.4\underline{ = 2 =} \ m_\text{1, Red} = 10 \cdot 0.2; $

$\hspace{1.85cm}\text{Standard deviation:}\hspace{0.4cm}\sigma = \sqrt{I \cdot p \cdot (1-p)} = \sqrt{m_1 \cdot (1-p)}\hspace{0.3cm}\Rightarrow\hspace{0.3cm} \sigma_{\rm Blue} = \sqrt{2 \cdot 0.6} =1.095 < \sigma_{\rm Red} = \sqrt{2 \cdot 0.8} = 1.265.$

(4)  Set Blue to Binomial distribution $(I=15, p=0.3)$ and Red to Poisson distribution $(\lambda=4.5)$.

What differences arise between both distributions regarding the average $m_1$ and variance $\sigma^2$?


$\hspace{1.0cm}\Rightarrow\hspace{0.3cm}\text{Both distributions have the same average:}\hspace{0.2cm}m_\text{1, Blue} = I \cdot p\ = 15 \cdot 0.3\hspace{0.15cm}\underline{ = 4.5 =} \ m_\text{1, Red} = \lambda$;

$\hspace{1.85cm} \text{Binomial distribution: }\hspace{0.2cm} \sigma_\text{Blue}^2 = m_\text{1, Blue} \cdot (1-p)\hspace{0.15cm}\underline { = 3.15} < \text{Poisson distribution: }\hspace{0.2cm} \sigma_\text{Red}^2 = \lambda\hspace{0.15cm}\underline { = 4.5}$;

(5)  Using the same settings as in (4), what are the probabilities ${\rm Pr}(z \gt 10)$ and ${\rm Pr}(z \gt 15)$?


$\hspace{1.0cm}\Rightarrow\hspace{0.3cm} \text{Binomial: }\hspace{0.2cm} {\rm Pr}(z \gt 10) = 1 - {\rm Pr}(z \le 10) = 1 - 0.9993 = 0.0007;\hspace{0.3cm} {\rm Pr}(z \gt 15) = 0 \ {\rm (exactly)}$.

$\hspace{1.85cm}\text{Poisson: }\hspace{0.2cm} {\rm Pr}(z \gt 10) = 1 - 0.9933 = 0.0067;\hspace{0.3cm}{\rm Pr}(z \gt 15) \gt 0\hspace{0.2cm}( \approx 0)$;

$\hspace{1.85cm}\text{Approximation: }\hspace{0.2cm}{\rm Pr}(z \gt 15) \ge {\rm Pr}(z = 16) = \lambda^{16} /{16!}\cdot {\rm e}^{-\lambda}\approx 1.5 \cdot 10^{-5}.$

(6)  Using the same settings as in (4), which parameters lead to a symmetric distribution around $m_1$?


$\hspace{1.0cm}\Rightarrow\hspace{0.3cm} \text{Binomial distribution with }p = 0.5\text{: }p_\mu = {\rm Pr}(z = \mu)\text{ symmetric around } m_1 = I/2 = 7.5 \ \Rightarrow \ p_\mu = p_{I-\mu}\ \Rightarrow \ p_8 = p_7, \ p_9 = p_6, \text{ etc.}$

$\hspace{1.85cm}\text{In contrast, the Poisson distribution is never symmetric, since it extends to infinity!}$


Applet Manual

[Figure "Handhabung binomial.png": applet screenshot with the control elements (A) to (J)]

    (A)     Preselection for blue parameter set

    (B)     Parameter input: Sliders $I$ and $p$

    (C)     Preselection for Red parameter set

    (D)     Parameter input: Slider $\lambda$

    (E)     Graphic display of the Distribution

    (F)     Output of moments for blue parameter set

    (G)     Output of moments for red parameter set

    (H)     Variation possibilities for the graphic display

$\hspace{1.5cm}$"$+$" (Zoom in),

$\hspace{1.5cm}$ "$-$" (Zoom out)

$\hspace{1.5cm}$ "$\rm o$" (Reset)

$\hspace{1.5cm}$ "$\leftarrow$" (Move left), etc.

    ( I )     Output of ${\rm Pr} (z = \mu)$ and ${\rm Pr} (z \le \mu)$

    (J)     Exercises: Exercise selection, description and solution

Other options for graphic display:

  • Hold shift and scroll: Zoom in on/out of coordinate system,
  • Hold shift and left click: Move the coordinate system.

About the Authors

This interactive calculation tool was designed and realized at the  Lehrstuhl für Nachrichtentechnik  of the  Technische Universität München.

  • The original version was created in 2003 by  Ji Li as part of her Diploma thesis using "FlashMX–Actionscript"  (Supervisor:  Günter Söder).
  • In 2018 this applet was redesigned and updated to "HTML5" by  Jimmy He  as part of his Bachelor's thesis (Supervisor:  Tasnád Kernetzky).

Once again: Open Applet in new Tab

Open Applet in new Tab         English Applet with German WIKI description