Theory of Stochastic Signals/Probability Density Function

From LNTwww
Latest revision as of 17:26, 14 February 2024


# OVERVIEW OF THE THIRD MAIN CHAPTER #


We consider here  »continuous random variables«,  i.e.,  random variables which can assume infinitely many different values,  at least in certain ranges of real numbers. 

  • Their applications in information and communication technology are manifold.
  • They are used,  among other things,  for the simulation of noise signals and for the description of fading effects.


We restrict ourselves at first to the statistical description of the  »amplitude distribution«.  In detail,  the following are treated:

  • The relationship between  »probability density function«  $\rm (PDF)$  and  »cumulative distribution function«  $\rm (CDF)$;
  • the calculation of  »expected values  and  moments«;
  • some  »special cases«  of value-continuous distributions:
  1. uniformly distributed random variables, 
  2. Gaussian distributed random variables, 
  3. exponentially distributed random variables, 
  4. Laplace distributed random variables, 
  5. Rayleigh distributed random variables, 
  6. Rice distributed random variables, 
  7. Cauchy distributed random variables;
  • the  »generation of continuous random variables«  on a computer.


»Inner statistical dependencies«  of the underlying processes  are not considered here.  For this,  we refer to the following main chapters  $4$  and  $5$.


Properties of continuous random variables


In the second chapter it was shown that the amplitude distribution of a discrete random variable is completely determined by its  $M$  occurrence probabilities,  where the level number  $M$  usually has a finite value.

$\text{Definition:}$  By a  »value-continuous random variable«  is meant a random variable whose possible numerical values are uncountable   ⇒   $M \to \infty$.  In the following,  we will often use the short form  »continuous random variable«.


Further it shall hold:

  1. We  (mostly)  denote value-continuous random variables with  $x$  in contrast to the value-discrete random variables,  which are denoted with  $z$  as before.
  2. No statement is made here about a possible time discretization,  i.e.,  value-continuous random variables can be discrete in time.
  3. We assume for this chapter that there are no statistical bindings between the individual samples  $x_ν$,  or at least leave them out of consideration.


$\text{Example 1:}$ The graphic shows a section of a stochastic noise signal  $x(t)$  whose instantaneous value can be taken as a continuous random variable  $x$.

Signal and PDF of a Gaussian noise signal
  • From the  »probability density function«   $\rm (PDF)$  shown on the right,  it can be seen that instantaneous values around the mean  $m_1$  occur most frequently for this example signal.


  • Since there are no statistical dependencies between the samples $x_ν$,  such a signal is often also referred to as  »white noise«.


Definition of the probability density function


For a value-continuous random variable,  the probabilities that it takes on quite specific values are zero.  Therefore,  to describe a value-continuous random variable,  we must always refer to the  »probability density function«  $\rm (PDF)$.

$\text{Definition:}$   The value of the  »probability density function«  $f_{x}(x)$  at location  $x_\mu$  is equal to the probability that the instantaneous value of the random variable  $x$  lies in an  $($infinitesimally small$)$  interval of width  $Δx$  around  $x_\mu$,  divided by  $Δx$:

$$f_x(x=x_\mu) = \lim_{\rm \Delta \it x \hspace{0.05cm}\to \hspace{0.05cm}\rm 0}\frac{\rm Pr \{\it x_\mu-\rm \Delta \it x/\rm 2 \le \it x \le x_\mu \rm +\rm \Delta \it x/\rm 2\} }{\rm \Delta \it x}.$$


This extremely important descriptive variable has the following properties:

  • Although from the time course in  $\text{Example 1}$  it can be seen  that the most frequent signal components lie at  $x = m_1$  and the PDF has its largest value here,  for a value-continuous random variable the probability  ${\rm Pr}(x = m_1)$,  that the instantaneous value is exactly equal to the mean  $m_1$,  is identically zero.
  • The probability that the random variable lies in the range between  $x_{\rm u}$  and  $x_{\rm o}$  is:
$${\rm Pr}(x_{\rm u} \le x \le x_{\rm o})= \int_{x_{\rm u} }^{x_{\rm o} }f_{x}(x) \,{\rm d}x.$$
  • As an important normalization property,  this yields for the area under the PDF with the boundary transitions  $x_{\rm u} → \hspace{0.05cm} - \hspace{0.05cm} ∞$  and  $x_{\rm o} → +∞:$
$$\int_{-\infty}^{+\infty} f_{x}(x) \,{\rm d}x = \rm 1.$$
  • The corresponding equation for value-discrete,  $M$-level random variables states that the sum over the  $M$  occurrence probabilities gives the value  $1$.
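These properties can be illustrated numerically. The following is a minimal Python sketch, not part of the original text: it takes a Gaussian PDF as an example and approximates the integrals by a midpoint Riemann sum; the helper names `gaussian_pdf` and `prob_in_range` are ours.

```python
import math

def gaussian_pdf(x, m1=0.0, sigma=1.0):
    """Gaussian PDF f_x(x) with mean m1 and standard deviation sigma."""
    return math.exp(-0.5 * ((x - m1) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def prob_in_range(pdf, x_u, x_o, steps=100_000):
    """Pr(x_u <= x <= x_o): midpoint Riemann sum of the area under the PDF."""
    dx = (x_o - x_u) / steps
    return sum(pdf(x_u + (i + 0.5) * dx) for i in range(steps)) * dx

# Normalization property: the total area under the PDF is 1
# (the integration range is truncated at +-8 sigma, where the tails are negligible).
area = prob_in_range(gaussian_pdf, -8.0, 8.0)

# Pr(x = m1) itself is zero: a zero-width interval carries no area,
# even though the PDF is largest at x = m1.
```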


$\text{Note on nomenclature:}$ 

In the literature,  a distinction is often made between the random variable  $X$  and its realizations  $x ∈ X$.  Thus,  the above definition equation is

$$f_{X}(X=x) = \lim_{ {\rm \Delta} x \hspace{0.05cm}\to \hspace{0.05cm} 0}\frac{ {\rm Pr} \{ x - {\rm \Delta} x/2 \le X \le x +{\rm \Delta} x/ 2\} }{ {\rm \Delta} x}.$$

We have largely dispensed with this more precise nomenclature in our  $\rm LNTwww$  so as not to use up two letters for one quantity.

  1. Lowercase letters  $($as  $x)$  often denote signals and uppercase letters  $($as  $X)$  the associated spectra in our case.
  2. Nevertheless,  today (2017)  we have to honestly admit that the 2001 decision was not an entirely happy one.

PDF definition for discrete random variables

For reasons of a uniform representation of all random variables  $($both value-discrete and value-continuous$)$,  it is convenient to define the probability density function also for value-discrete random variables.

$\text{Definition:}$   Applying the definition equation of the last section to value-discrete random variables,  the PDF takes infinitely large values at some points  $x_\mu$  due to the nonvanishingly small probability value and the limit transition  $Δx → 0$.

Thus,  the PDF results in a sum of  »Dirac delta functions«   ⇒   »distributions«:

$$f_{x}(x)=\sum_{\mu=1}^{M}p_\mu\cdot {\rm \delta}( x-x_\mu).$$

The weights of these Dirac delta functions are equal to the probabilities  $p_\mu = {\rm Pr}(x = x_\mu)$.


Here is another note to help classify the different descriptive quantities for value-discrete and value-continuous random variables:   Probability and probability density function are related to each other in a similar way as,  in the book  »Signal Representation«,

  • a discrete spectral component of a harmonic oscillation ⇒ line spectrum,  and
  • a continuous spectrum of an energy-limited  $($pulse-shaped$)$  signal.


Signal and PDF of a ternary signal

$\text{Example 2:}$  Below you can see a section

  • of a rectangular signal with three possible values,
  • where the signal value  $0 \ \rm V$  occurs twice as often as the outer signal values  $(\pm 1 \ \rm V)$.


Thus,  the corresponding  PDF  $($values from top to bottom$)$  is:

$$f_{x}(x) = 0.25 \cdot \delta(x - 1\,{\rm V})+ 0.5\cdot \delta(x) + 0.25\cdot \delta (x + 1\,{\rm V}).$$
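Such a Dirac-sum PDF can be handled numerically by storing only the weights  $p_\mu$  at the positions  $x_\mu$. A minimal Python sketch for the ternary PDF above (the dictionary representation and the helper name `prob_in_range` are our choices):

```python
# Dirac weights p_mu at positions x_mu (in volts) for the ternary PDF above.
pdf_weights = {+1.0: 0.25, 0.0: 0.5, -1.0: 0.25}

def prob_in_range(weights, x_u, x_o):
    """Pr(x_u <= x <= x_o): integrating the Dirac sum simply
    collects the weights whose positions lie inside the interval."""
    return sum(p for x_mu, p in weights.items() if x_u <= x_mu <= x_o)

# The M = 3 occurrence probabilities must sum to 1.
total = sum(pdf_weights.values())
```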

⇒   For a more in-depth look at the topic covered here,  we recommend the following  $($German language$)$  learning video:

»Wahrscheinlichkeit und WDF«   ⇒   »Probability and probability density function«

Numerical determination of the PDF


You can see here a scheme for the numerical determination of the probability density function.  Assuming that the random variable  $x$  at hand has negligible values outside the range from  $x_{\rm min} = -4.02$  to  $x_{\rm max} = +4.02$,  proceed as follows:

For numerical determination of the PDF


  1. Divide the range of  $x$-values into  $I$  intervals of equal width  $Δx$  and define a field  PDF$[0 : I-1]$.  In the sketch  $I = 201$  and accordingly  $Δx = 0.04$  is chosen.
  2. The random variable  $x$  is now called  $N$  times in succession,  each time checking to which interval  $i_{\rm act}$  the current random value  $x_{\rm act}$  belongs:
        $i_{\rm act} = ({\rm int})((x_{\rm act} + x_{\rm max})/Δx).$
  3. The corresponding field element  PDF$[i_{\rm act}]$  is then incremented by  $1$. 
  4. After $N$ iterations, PDF$[i_{\rm act}]$ then contains the number of random numbers belonging to the interval $i_{\rm act}$.
  5. The actual PDF values are obtained if,  at the end,  all field elements  PDF$[i]$  with  $0 ≤ i ≤ I-1$  are still divided by  $N \cdot Δx$.
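The five steps above can be sketched in Python. This is a hedged sketch, not the original implementation: the function name `estimate_pdf` is ours, and `random.gauss` merely stands in for the call that draws the random variable  $x$.

```python
import random

def estimate_pdf(draw, n, x_max=4.02, num_intervals=201):
    """Numerical PDF determination by the counting scheme above.

    draw          -- function returning one sample of the random variable x
    n             -- number of calls N
    x_max         -- values outside (-x_max, +x_max) are assumed negligible
    num_intervals -- number I of intervals of equal width dx
    """
    dx = 2.0 * x_max / num_intervals          # step 1: here dx = 8.04/201 = 0.04
    pdf = [0] * num_intervals                 # step 1: field PDF[0 : I-1]
    for _ in range(n):                        # step 2: call the variable N times
        x_act = draw()
        i_act = int((x_act + x_max) / dx)     # step 2: interval of the current value
        if 0 <= i_act < num_intervals:        # out-of-range samples are negligible
            pdf[i_act] += 1                   # steps 3/4: count per interval
    return [count / (n * dx) for count in pdf]  # step 5: divide by N*dx

random.seed(1)
pdf_est = estimate_pdf(lambda: random.gauss(0.0, 1.0), n=100_000)
```

With a Gaussian source the estimate peaks near the middle interval  (index $100$,  i.e. $x ≈ 0)$,  and the estimated area  $\sum \text{PDF}[i] \cdot Δx$  is close to  $1$.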


$\text{Example 3:}$  From the drawn green arrows in the graph above,  one can see:

  • The value  $x_{\rm act} = 0.07$  leads to the result  $i_{\rm act} =$ (int) ((0.07 + 4.02)/0.04) = (int) $102.25$.
  • Here  »(int)«  means an integer conversion after float division   ⇒   $i_{\rm act} = 102$.
  • The same interval  $i_{\rm act} = 102$  results for  $0.06 < x_{\rm act} < 0.10$,  so for example also for  $x_{\rm act} = 0.09$.
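The index arithmetic of this example can be checked directly; a one-function sketch (the name `interval_index` is ours):

```python
# Interval index as in the scheme: integer conversion after float division.
X_MAX, DX = 4.02, 0.04

def interval_index(x_act):
    """Index i_act of the interval containing the current value x_act."""
    return int((x_act + X_MAX) / DX)

# x_act = 0.07: (0.07 + 4.02)/0.04 = 102.25, truncated to 102.
# x_act = 0.09 falls into the same interval, since 0.06 < x_act < 0.10.
```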

Exercises for the chapter


Exercise 3.1: Cosine-square PDF and PDF with Dirac Functions

Exercise 3.1Z: Triangular PDF