{{Header
|Untermenü=Discrete Random Variables
|Vorherige Seite=Binomial Distribution
|Nächste Seite=Generation of Discrete Random Variables
}}
 
==Probabilities of the Poisson distribution==
<br>
{{BlaueBox|TEXT=
$\text{Definition:}$&nbsp; The&nbsp; &raquo;'''Poisson distribution'''&laquo;&nbsp; is a limiting case of the&nbsp; [[Theory_of_Stochastic_Signals/Binomial_Distribution#General_description_of_the_binomial_distribution|&raquo;binomial distribution&laquo;]],&nbsp; where
*on the one hand,&nbsp; the limit transitions&nbsp; $I → ∞$&nbsp; and&nbsp; $p → 0$&nbsp; are assumed,
*additionally,&nbsp; it is assumed that the product has the following finite value:
:$$I \cdot p = \lambda.$$

The parameter&nbsp; $\lambda$&nbsp; gives the average number of&nbsp; &raquo;ones&laquo;&nbsp; in a fixed time unit and is called the&nbsp; &raquo;'''rate'''&laquo;. }}
 
  
$\text{Further,&nbsp; it should be noted:}$
*In contrast to the binomial distribution&nbsp; $(0 ≤ μ ≤ I)$,&nbsp; here the random quantity can take on arbitrarily large&nbsp; $($integer,&nbsp; non-negative$)$&nbsp; values.
*This means that the set of possible values is countably infinite here.
*But since no intermediate values can occur,&nbsp; this is also called a&nbsp; &raquo;discrete distribution&laquo;.
 
  
  
 
{{BlueBox|TEXT=   
$\text{Calculation rule:}$&nbsp; Taking into account the above limit transitions for the&nbsp; [[Theory_of_Stochastic_Signals/Binomial_Distribution#Probabilities_of_the_binomial_distribution|&raquo;probabilities of the binomial distribution&laquo;]],&nbsp; it follows for the&nbsp; &raquo;'''Poisson distribution probabilities'''&laquo;:

:$$p_\mu = {\rm Pr} ( z=\mu ) = \lim_{I\to\infty} \frac{I !}{\mu ! \cdot (I-\mu )!} \cdot \Big(\frac{\lambda}{I} \Big)^\mu \cdot \Big( 1-\frac{\lambda}{I}\Big)^{I-\mu}.$$

From this,&nbsp; after some algebraic transformations,&nbsp; we obtain:

:$$p_\mu = \frac{ \lambda^\mu}{\mu!}\cdot {\rm e}^{-\lambda}.$$}}
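This limit can be illustrated numerically with a short Python sketch:&nbsp; for a fixed rate&nbsp; $λ = I · p$,&nbsp; the binomial probability of,&nbsp; say,&nbsp; $\mu = 2$&nbsp; approaches the Poisson value as&nbsp; $I$&nbsp; grows&nbsp; $($the values of&nbsp; $I$&nbsp; below are chosen arbitrarily for illustration$)$:
<syntaxhighlight lang="python">
from math import comb, exp, factorial

def binomial_pmf(mu, I, p):
    # Pr(z = mu) of the binomial distribution
    return comb(I, mu) * p**mu * (1 - p)**(I - mu)

def poisson_pmf(mu, lam):
    # Pr(z = mu) = lam^mu / mu! * e^(-lam) of the Poisson distribution
    return lam**mu / factorial(mu) * exp(-lam)

lam, mu = 2.4, 2
print(f"Poisson : {poisson_pmf(mu, lam):.5f}")
for I in (6, 60, 600, 6000):
    p = lam / I                      # keep the product I * p = lam fixed
    print(f"I = {I:4d}: {binomial_pmf(mu, I, p):.5f}")
</syntaxhighlight>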
  
  
{{GraueBox|TEXT=
[[File: EN_Sto_T_2_4_S1_neu.png |frame| Binomial and Poisson probabilities| right]]   
$\text{Example 1:}$&nbsp; The graph on the right shows the probabilities of the
*binomial distribution with&nbsp; $I =6$,&nbsp; $p = 0.4$&nbsp; $($blue arrows and labels$)$,&nbsp; and

*Poisson distribution with&nbsp; $λ = 2.4$&nbsp; $($red arrows and labels$)$.


One can recognize:
#Both distributions have the same mean&nbsp; $m_1 = 2.4$.
#In the binomial distribution,&nbsp; ${\rm Pr}(z > 6) \equiv 0$&nbsp; holds.
#In the Poisson distribution,&nbsp; the outer values are more probable than in the binomial distribution.
#Random variables&nbsp; $z > 6$&nbsp; are also possible with the Poisson distribution.
#However,&nbsp; their probabilities are rather small at the chosen rate&nbsp; $($see the short numerical check below$)$. }}
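The numbers of&nbsp; $\text{Example 1}$&nbsp; can be reproduced with a few lines of Python&nbsp; $($illustrative sketch$)$:&nbsp; the Poisson probabilities for&nbsp; $λ = 2.4$&nbsp; and the small but non-zero tail probability&nbsp; ${\rm Pr}(z > 6)$:
<syntaxhighlight lang="python">
from math import exp, factorial

lam = 2.4
p = [lam**mu / factorial(mu) * exp(-lam) for mu in range(7)]   # Pr(z = 0), ..., Pr(z = 6)
print([round(x, 3) for x in p])
print("Pr(z > 6) =", round(1 - sum(p), 4))   # about 0.0116: small, but not identically zero
</syntaxhighlight>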
  
  
==Moments of the Poisson distribution==
<br>
{{BlueBox|TEXT=   
$\text{Calculation rule:}$&nbsp; &raquo;'''Mean'''&laquo;&nbsp; and&nbsp; &raquo;'''standard deviation'''&laquo;&nbsp; are obtained directly from the&nbsp; [[Theory_of_Stochastic_Signals/Binomial_Distribution#Moments_of_the_binomial_distribution|&raquo;corresponding equations of the binomial distribution&laquo;]]&nbsp; by a twofold limit process:

:$$m_1 =\lim_{\left.{I\hspace{0.05cm}\to\hspace{0.05cm}\infty \atop {p\hspace{0.05cm}\to\hspace{0.05cm} 0} }\right.} I \cdot p= \lambda,$$

:$$\sigma =\lim_{\left.{I\hspace{0.05cm}\to\hspace{0.05cm}\infty \atop {p\hspace{0.05cm}\to\hspace{0.05cm} 0} }\right.} \sqrt{I \cdot p \cdot (1-p)} = \sqrt {\lambda}.$$

From this it can be seen that for the Poisson distribution the variance always equals the mean:
:$$σ^2 = m_1 = λ.$$ }}
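This relation&nbsp; $σ^2 = m_1 = λ$&nbsp; can also be checked directly from the probabilities&nbsp; $p_\mu$&nbsp; with a short sketch&nbsp; $($the sum is truncated at&nbsp; $\mu = 100$,&nbsp; which is more than sufficient for moderate&nbsp; $λ)$:
<syntaxhighlight lang="python">
from math import exp, factorial

lam = 2.4
pmf = [lam**mu / factorial(mu) * exp(-lam) for mu in range(100)]
m1  = sum(mu * p for mu, p in enumerate(pmf))      # mean
m2  = sum(mu**2 * p for mu, p in enumerate(pmf))   # second moment
print(m1, m2 - m1**2)                              # both are approximately lam = 2.4
</syntaxhighlight>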
  
  
{{GraueBox|TEXT=
[[File: P_ID616__Sto_T_2_4_S2neu.png |frame| Moments of the Poisson distribution | right]]   
$\text{Example 2:}$&nbsp; As in&nbsp; $\text{Example 1}$,&nbsp; here we compare
*the binomial distribution with&nbsp; $I =6$,&nbsp; $p = 0.4$&nbsp; $($blue arrows and labels$)$,&nbsp; and

*the Poisson distribution with&nbsp; $λ = 2.4$&nbsp; $($red arrows and labels$)$.


One can see from the accompanying sketch:
#Both distributions have exactly the same mean&nbsp; $m_1 = 2.4$.
#For the red Poisson distribution,&nbsp; the standard deviation is&nbsp; $σ ≈ 1.55$.
#In contrast,&nbsp; for the blue binomial distribution,&nbsp; the standard deviation is only&nbsp; $σ = 1.2$.


&rArr; &nbsp; With the interactive HTML 5/JavaScript applet&nbsp; [[Applets:Binomial_and_Poisson_Distribution_(Applet)|&raquo;Binomial and Poisson Distribution&laquo;]],&nbsp; you can
*determine the probabilities and moments of the Poisson distribution for arbitrary values of&nbsp; $λ$

*and visualize the similarities and differences compared to the binomial distribution.}}
  
==Comparison of binomial distribution vs. Poisson distribution==
<br>
Now both the similarities and the differences between binomial and Poisson distributed random variables shall be worked out once more.

The&nbsp; &raquo;'''binomial distribution'''&laquo;&nbsp; is suitable for describing stochastic events that are characterized by a given clock&nbsp; $T$.&nbsp; For example,&nbsp; for&nbsp; [[Examples_of_Communication_Systems/General_Description_of_ISDN|'''ISDN''']]&nbsp; $($&raquo;Integrated Services Digital Network&laquo;$)$&nbsp; with&nbsp; $64 \ \rm kbit/s$,&nbsp; the clock time is&nbsp; $T \approx 15.6 \ \rm &micro; s$.
[[File:  EN_Sto_T_2_4_S3.png |right|frame| Binomial distribution&nbsp; $($blue$)$&nbsp; vs. Poisson distribution&nbsp; $($red$)$]]
#'''Binary events occur only in this time grid'''.&nbsp; Such events are,&nbsp; for example,&nbsp; the error-free&nbsp; $(e_i = 0)$&nbsp; or erroneous&nbsp; $(e_i = 1)$&nbsp; transmission of individual symbols.
#The binomial distribution then allows statistical statements about the number of transmission errors to be expected in a longer time interval&nbsp; $T_{\rm I} = I ⋅ T$,&nbsp; according to the upper diagram of the graph&nbsp; $($time points marked in blue$)$.


The&nbsp; &raquo;'''Poisson distribution'''&laquo;&nbsp; also makes statements about the number of binary events occurring in a finite time interval:
#If one assumes the same observation period&nbsp; $T_{\rm I}$&nbsp; and increases the number&nbsp; $I$&nbsp; of subintervals more and more,&nbsp; then the clock time&nbsp; $T$,&nbsp; at which a new binary event&nbsp; $(0$&nbsp; or&nbsp; $1)$&nbsp; can occur,&nbsp; becomes smaller and smaller.&nbsp; In the limiting case:&nbsp; $T \to 0$.
#This means:&nbsp; In the Poisson distribution,&nbsp; '''the binary events are possible'''&nbsp; not only at discrete time points given by a time grid,&nbsp; but&nbsp; '''at any time'''.&nbsp; The lower time diagram illustrates this fact.
#In order to obtain on average exactly as many&nbsp; &raquo;ones&laquo;&nbsp; during the time&nbsp; $T_{\rm I}$&nbsp; as in the binomial distribution&nbsp; $($in the example:&nbsp; six$)$,&nbsp; the characteristic probability&nbsp; $p = {\rm Pr}( e_i = 1)$&nbsp; related to the infinitesimally small time interval&nbsp; $T$&nbsp; must tend to zero&nbsp; $($a small simulation sketch of both viewpoints follows below$)$.
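A minimal simulation sketch contrasting the two viewpoints;&nbsp; the parameters&nbsp; $I = 16$&nbsp; and&nbsp; $p = 0.375$&nbsp; are assumed here only so that&nbsp; $I · p = 6$&nbsp; &raquo;ones&laquo;&nbsp; occur on average in the observation period&nbsp; $T_{\rm I} = 1$:
<syntaxhighlight lang="python">
import random

random.seed(1)
I, p = 16, 0.375          # clocked (binomial) model: assumed values with I * p = 6
lam  = I * p              # average number of "ones" in the observation period T_I = 1

# Binomial view: binary events can occur only on the time grid t = i * T, with T = T_I / I
binomial_count = sum(random.random() < p for _ in range(I))

# Poisson view (T -> 0): events possible at any time, exponential waiting times with mean 1/lam
t, poisson_count = 0.0, 0
while True:
    t += random.expovariate(lam)   # waiting time to the next event
    if t > 1.0:                    # event falls outside the observation period T_I = 1
        break
    poisson_count += 1

print(binomial_count, poisson_count)   # both counts fluctuate around the mean lam = 6
</syntaxhighlight>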
  
  
==Applications of the Poisson distribution==
<br>
The Poisson distribution is the result of a so-called&nbsp; [https://en.wikipedia.org/wiki/Poisson_point_process &raquo;Poisson process&laquo;].&nbsp; Such a process is often used as a model for sequences of events that may occur at random times.&nbsp; Examples of such events include
#the prediction of equipment failures &ndash; an important task in reliability theory,
#the shot noise in optical transmission,&nbsp; and
#the start of telephone calls in a switching center&nbsp; $($&raquo;teletraffic engineering&laquo;$)$.
  
  
 
{{GraueBox|TEXT=   
$\text{Example 3:}$&nbsp; If a switching center receives ninety switching requests per minute&nbsp; $(λ = 1.5 \text{ per second})$&nbsp; on a long&ndash;term average,&nbsp; then the probabilities&nbsp; $p_\mu$&nbsp; that exactly&nbsp; $\mu$&nbsp; requests occur in an arbitrary one-second period are:

:$$p_\mu = \frac{1.5^\mu}{\mu!}\cdot {\rm e}^{-1.5}.$$

This gives the numerical values&nbsp; $p_0 = 0.223$,&nbsp; $p_1 = 0.335$,&nbsp; $p_2 = 0.251$,&nbsp; etc.


From this,&nbsp; further characteristics can be derived:
*The time&nbsp; $τ$&nbsp; between two connection requests follows the&nbsp; [[Theory_of_Stochastic_Signals/Exponentially_Distributed_Random_Variables#One-sided_exponential_distribution|&raquo;exponential distribution&laquo;]].

*Thus,&nbsp; the mean time interval between two connection requests is&nbsp; ${\rm E}[\hspace{0.05cm}τ\hspace{0.05cm}] = 1/λ ≈ 0.667 \ \rm s$.}}
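The numerical values of&nbsp; $\text{Example 3}$&nbsp; follow directly from the formula above&nbsp; $($short sketch$)$:
<syntaxhighlight lang="python">
from math import exp, factorial

lam = 1.5                            # switching requests per second
for mu in range(3):
    print(f"p_{mu} = {lam**mu / factorial(mu) * exp(-lam):.3f}")   # 0.223, 0.335, 0.251
print(f"E[tau] = {1 / lam:.3f} s")   # mean spacing between two requests, about 0.667 s
</syntaxhighlight>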
  
==Exercises for the chapter==
<br>
[[Aufgaben:Exercise_2.5:_"Binomial"_or_"Poisson"%3F|Exercise 2.5: "Binomial" or "Poisson"?]]

[[Aufgaben:Exercise_2.5Z:_Flower_Meadow|Exercise 2.5Z: Flower Meadow]]

{{Display}}
