Theory of Stochastic Signals/Poisson Distribution

From LNTwww
Revision as of 15:06, 14 December 2021

Probabilities of the Poisson distribution


$\text{Definition:}$  The  Poisson distribution  is a limiting case of the  binomial distribution,  where

  • on the one hand,  the limit transitions  $I → ∞$  and  $p → 0$  are assumed,
  • additionally,  it is assumed that the product  $I · p = λ$  has a finite value.


The parameter  $λ$  gives the average number of  "ones"  in a fixed unit of time and is called the  rate.


Further,  it should be noted:

  • In contrast to the binomial distribution  $(0 ≤ μ ≤ I)$,  the random variable here can take on arbitrarily large  (integer,  non-negative)  values.
  • This means that the set of possible values here is countably infinite  (not finite as for the binomial distribution).
  • But since no intermediate values can occur,  this is also called a  "discrete distribution".


$\text{Calculation rule:}$ 

  • Considering the above limit transitions for the  probabilities of the binomial distribution,  it follows for the  probabilities of the Poisson distribution:

$$p_\mu = {\rm Pr} ( z=\mu ) = \lim_{I\to\infty} \frac{I !}{\mu ! \cdot (I-\mu )!} \cdot \left(\frac{\lambda}{I} \right)^\mu \cdot \left( 1-\frac{\lambda}{I}\right)^{I-\mu}.$$

  • From this,  after some algebraic transformations,  we obtain:

$$p_\mu = \frac{ \lambda^\mu}{\mu!}\cdot {\rm e}^{-\lambda}.$$
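The limit relation above can be checked numerically. The following is a small Python sketch (the function names are our own, not part of the lesson): it evaluates the binomial probability with $p = λ/I$ for growing $I$ and compares it with the closed-form Poisson probability.

```python
from math import comb, exp, factorial

def poisson_pmf(mu, lam):
    # closed form: p_mu = lam^mu / mu! * e^(-lam)
    return lam**mu / factorial(mu) * exp(-lam)

def binomial_pmf_limit(mu, lam, I):
    # binomial probability with p = lam/I; tends to the Poisson pmf as I -> infinity
    p = lam / I
    return comb(I, mu) * p**mu * (1 - p)**(I - mu)

lam, mu = 2.4, 2
for I in (10, 100, 10_000):
    print(I, binomial_pmf_limit(mu, lam, I))
print("Poisson:", poisson_pmf(mu, lam))  # approx. 0.2613
```

Already for $I = 10\hspace{0.05cm}000$ the binomial value agrees with the Poisson value to about three decimal places.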


Probabilities of the Poisson distribution
compared to the binomial probabilities

$\text{Example 1:}$  The probabilities

  • of the binomial distribution with  $I =6$,  $p = 0.4$,  and
  • of the Poisson distribution with  $λ = 2.4$


can be seen in the graph on the right.  One can see:

  • Both distributions have the same mean  $m_1 = 2.4$.
  • In the Poisson distribution  (red arrows and labels)  the  "outer values"  are more probable than in the binomial distribution.
  • In addition,  random variables  $z > 6$  are also possible with the Poisson distribution,  but their probabilities are also rather small at the chosen rate.
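The observations of this example can be reproduced with a few lines of Python (a sketch using the example's parameters; the variable names are ours):

```python
from math import comb, exp, factorial

I, p = 6, 0.4
lam = I * p  # = 2.4, so both distributions have the same mean

binom = [comb(I, mu) * p**mu * (1 - p)**(I - mu) for mu in range(I + 1)]
poisson = [lam**mu / factorial(mu) * exp(-lam) for mu in range(I + 1)]

for mu in range(I + 1):
    print(mu, round(binom[mu], 4), round(poisson[mu], 4))
# the outer values mu = 0 and mu = 6 are noticeably more probable
# for the Poisson distribution than for the binomial distribution
```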


Moments of the Poisson distribution


$\text{Calculation rule:}$ 

  • The mean and the rms value  (standard deviation)  of the Poisson distribution are obtained directly from the corresponding equations of the binomial distribution by the twofold limiting process:

$$m_1 =\lim_{\left.{I\hspace{0.05cm}\to\hspace{0.05cm}\infty \atop {p\hspace{0.05cm}\to\hspace{0.05cm} 0} }\right.} I \cdot p= \lambda,$$
$$\sigma =\lim_{\left.{I\hspace{0.05cm}\to\hspace{0.05cm}\infty \atop {p\hspace{0.05cm}\to\hspace{0.05cm} 0} }\right.} \sqrt{I \cdot p \cdot (1-p)} = \sqrt {\lambda}.$$
  • From this it can be seen that in the Poisson distribution the variance is always  $σ^2 = m_1 = λ$.
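For the values used in the examples $(λ = 2.4$, respectively $I = 6$, $p = 0.4)$ the moments can be evaluated directly. A short Python sketch:

```python
from math import sqrt

lam = 2.4
m1 = lam                 # mean of the Poisson distribution
sigma = sqrt(lam)        # rms value (standard deviation), approx. 1.549

# in the Poisson distribution the variance always equals the mean:
assert abs(sigma**2 - m1) < 1e-12

# binomial distribution with the same mean (I = 6, p = 0.4):
I, p = 6, 0.4
sigma_binom = sqrt(I * p * (1 - p))   # = 1.2
print(round(sigma, 3), round(sigma_binom, 3))
```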


Moments of the Poisson distribution

$\text{Example 2:}$ 

As in  $\text{Example 1}$,  here we compare:

  • the binomial distribution with  $I =6$,  $p = 0.4$,  and
  • the Poisson distribution with  $λ = 2.4$.


One can see from the accompanying sketch:

  • Both distributions have exactly the same mean  $m_1 = 2.4$.
  • For the Poisson distribution  (marked red in the figure),  the rms value  (standard deviation)  is  $σ ≈ 1.55$.
  • In contrast,  for the (blue) binomial distribution,  the rms value is only  $σ = 1.2$.


With the interactive HTML 5/JavaScript applet  "Binomial and Poisson Distribution" 

  • you can determine the probabilities and means (moments) of the Poisson distribution for any  $λ$-values
  • and visualize the similarities and differences compared to the binomial distribution.


Comparison of binomial distribution vs. Poisson distribution


Now both the similarities and the differences between binomial and Poisson distributed random variables shall be worked out again.

Scheme for binomial distribution  (red)  and Poisson distribution  (blue)

The  binomial distribution  is suitable for describing stochastic events that are characterized by a given clock  $T$.  For example,  for  ISDN  ("Integrated Services Digital Network")  with  $64 \ \rm kbit/s$,  the clock time is  $T \approx 15.6 \ \rm µs$.

  • Binary events only occur in this time grid.  Such events are,  for example,  the error-free  $(e_i = 0)$  or erroneous  $(e_i = 1)$  transmission of individual symbols.
  • The binomial distribution now allows statistical statements about the number of transmission errors to be expected in a longer time interval  $T_{\rm I} = I ⋅ T$  according to the upper diagram of the graph  (time marked in blue).


The  Poisson distribution  also makes statements about the number of binary events occurring in a finite time interval:

  • If one assumes the same observation period  $T_{\rm I}$  and increases the number  $I$  of subintervals more and more,  then the clock time  $T$,  at which a new binary event  ("0"  or  "1")  can occur,  becomes smaller and smaller;  in the limiting case,  $T \to 0$.
  • This means:  In the Poisson distribution,  the binary events are possible  not only at discrete time points given by a time grid,  but  at any time.  The time diagram below illustrates this fact.
  • In order to obtain on average during time  $T_{\rm I}$  exactly as many  "ones"  as in the binomial distribution  (in the example:  six),  however,  the characteristic probability  $p = {\rm Pr}( e_i = 1)$  related to the infinitesimally small time interval  $T$  must tend to zero.
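The refinement of the time grid can also be simulated: keep the observation period and the average number of "ones" fixed, and let the number of subintervals $I$ grow. The following is a Monte-Carlo sketch in Python (the trial count and all names are our choices):

```python
import random
random.seed(1)

lam = 6        # average number of "ones" in the observation period T_I
trials = 2000

def count_ones(I):
    # one run of the Bernoulli grid: I slots, each carrying a "one" with p = lam/I
    p = lam / I
    return sum(random.random() < p for _ in range(I))

for I in (10, 100, 1000):
    counts = [count_ones(I) for _ in range(trials)]
    mean = sum(counts) / trials
    var = sum((c - mean) ** 2 for c in counts) / trials
    print(I, round(mean, 2), round(var, 2))
# the empirical variance I*p*(1-p) grows toward the mean lam as I increases,
# as expected for the Poisson limit (sigma^2 = m_1 = lam)
```

For $I = 10$ the variance is only about $2.4$, for $I = 1000$ it is already close to the mean $6$.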


Applications of the Poisson distribution


The Poisson distribution is the result of a so-called  Poisson process.  Such a process is often used as a model for sequences of events that may occur at random times.  Examples of such events include:

  • the failure of equipment,  an important topic in reliability theory,
  • the shot noise in optical transmission,  and
  • the start of telephone calls in a switching center  ("teletraffic engineering").


$\text{Example 3:}$  If ninety switching requests per minute  $($⇒   $λ = 1.5 \text{ per second})$  are received by a switching center on a long–term average,  the probabilities  $p_\mu$  that exactly  $\mu$  connections occur in any one-second period are:

$$p_\mu = \frac{1.5^\mu}{\mu!}\cdot {\rm e}^{-1.5}.$$

This gives the numerical values  $p_0 = 0.223$,   $p_1 = 0.335$,   $p_2 = 0.251$, etc.

From this,  further characteristics can be derived:

  • The distance  $τ$  between two switching requests satisfies the  exponential distribution.
  • The mean time interval between two switching requests is  ${\rm E}[\hspace{0.05cm}τ\hspace{0.05cm}] = 1/λ ≈ 0.667 \ \rm s$.
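The numbers of this example follow directly from the Poisson formula; a short Python check:

```python
from math import exp, factorial

lam = 1.5   # switching requests per second (ninety per minute)

# probabilities p_0, p_1, p_2 of exactly mu requests in a one-second period
p = [lam**mu / factorial(mu) * exp(-lam) for mu in range(3)]
print([round(x, 3) for x in p])   # [0.223, 0.335, 0.251]

# mean distance between two requests (exponential distribution)
mean_gap = 1 / lam
print(round(mean_gap, 3))         # 0.667
```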


Exercises for the chapter


Exercise 2.5: "Binomial" or "Poisson"?

Exercise 2.5Z: Flower Meadow