
Exercise 3.2Z: Two-dimensional Probability Mass Function

From LNTwww
 
{{quiz-Header|Buchseite=Information_Theory/Some_Preliminary_Remarks_on_Two-Dimensional_Random_Variables
}}
[[File:P_ID2752__Inf_Z_3_2_neu.png|right|frame|PMF of the two-dimensional random variable&nbsp; $XY$]]
We consider the random variables&nbsp; $X = \{ 0,\ 1,\ 2,\ 3 \}$&nbsp; and&nbsp; $Y = \{ 0,\ 1,\ 2 \}$,&nbsp; whose joint probability mass function&nbsp; $P_{XY}(X,\ Y)$&nbsp; is given.
*From this two-dimensional probability mass function&nbsp; (PMF),&nbsp; the one-dimensional probability mass functions&nbsp; $P_X(X)$&nbsp; and&nbsp; $P_Y(Y)$&nbsp; are to be determined.
*Such a one-dimensional probability mass function is sometimes also called a&nbsp; "marginal probability".
  
If&nbsp; $P_{XY}(X,\ Y)=P_X(X) \cdot P_Y(Y)$&nbsp; holds, the two random variables&nbsp; $X$&nbsp; and&nbsp; $Y$&nbsp; are statistically independent.&nbsp; Otherwise, there are statistical dependencies between them.

In the second part of the task we consider the random variables&nbsp; $U= \big \{ 0,\ 1 \big \}$&nbsp; and&nbsp; $V= \big \{ 0,\ 1 \big \}$,&nbsp; which result from&nbsp; $X$&nbsp; and&nbsp; $Y$&nbsp; by modulo-2 operations:

:$$U = X \hspace{0.1cm}\text{mod} \hspace{0.1cm} 2, \hspace{0.8cm} V = Y \hspace{0.1cm}\text{mod} \hspace{0.1cm} 2.$$
  
  
<u>Hints:</u>
*The exercise belongs to the chapter&nbsp; [[Information_Theory/Einige_Vorbemerkungen_zu_zweidimensionalen_Zufallsgrößen|Some preliminary remarks on two-dimensional random variables]].
*The same constellation is assumed here as in&nbsp; [[Aufgaben:Aufgabe_3.2:_Erwartungswertberechnungen|Exercise 3.2]].
*There the random variable&nbsp; $Y = \{ 0,\ 1,\ 2,\ 3 \}$&nbsp; was considered, but with the additional condition&nbsp; ${\rm Pr}(Y = 3) = 0$.
*The property&nbsp; $|X|=|Y|$&nbsp; forced in this way was advantageous in the previous exercise for the formal calculation of the expected value&nbsp; ${\rm E}\big[P_X(X)\big]$.
*If the numerical value&nbsp; "0"&nbsp; is required, please enter&nbsp; "0."&nbsp; instead.


===Questions===
  
 
<quiz display=simple>

{What is the probability mass function&nbsp; $P_X(X)$?
|type="{}"}
$P_X(0) \ = \ $ { 0.5 3% }
$P_X(1) \ = \ $ { 0.125 3% }
$P_X(2) \ = \ $ { 0. }
$P_X(3) \ = \ $ { 0.375 3% }

{What is the probability mass function&nbsp; $P_Y(Y)$?
|type="{}"}
$P_Y(0) \ = \ $ { 0.5 3% }
$P_Y(1) \ = \ $ { 0.25 3% }
$P_Y(2) \ = \ $ { 0.25 3% }

{Are the random variables&nbsp; $X$&nbsp; and&nbsp; $Y$&nbsp; statistically independent?
|type="()"}
- Yes,
+ No.

{Determine the probabilities&nbsp; $P_{UV}( U,\ V)$.
|type="{}"}
$P_{UV}( U = 0,\ V = 0) \ = \ $ { 0.375 3% }
$P_{UV}( U = 0,\ V = 1) \ = \ $ { 0.125 3% }
$P_{UV}( U = 1,\ V = 0) \ = \ $ { 0.375 3% }
$P_{UV}( U = 1,\ V = 1) \ = \ $ { 0.125 3% }

{Are the random variables&nbsp; $U$&nbsp; and&nbsp; $V$&nbsp; statistically independent?
|type="()"}
+ Yes,
- No.
</quiz>
  
===Solution===
{{ML-Kopf}}
'''(1)'''&nbsp; One gets from&nbsp; $P_{XY}(X,\ Y)$&nbsp; to the one-dimensional probability mass function&nbsp; $P_X(X)$&nbsp; by summing up all&nbsp; $Y$&nbsp; probabilities:
:$$P_X(X = x_{\mu}) = \sum_{y \hspace{0.05cm} \in \hspace{0.05cm} Y} \hspace{0.1cm} P_{XY}(x_{\mu}, y).$$
*One thus obtains the following numerical values:
:$$P_X(X = 0) = 1/4+1/8+1/8 = 1/2 \hspace{0.15cm}\underline{= 0.500},$$
:$$P_X(X = 1)= 0+0+1/8 =  1/8 \hspace{0.15cm}\underline{= 0.125},$$
:$$P_X(X = 2) =  0+0+0 \hspace{0.15cm}\underline{= 0},$$
:$$P_X(X = 3) = 1/4+1/8+0=3/8 \hspace{0.15cm}\underline{= 0.375}\hspace{0.5cm} \Rightarrow  \hspace{0.5cm}    P_X(X) = \big [ 1/2, \ 1/8 , \ 0 , \ 3/8 \big ].$$
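This marginalization can be checked numerically. A minimal Python sketch (assuming NumPy is available; the joint PMF matrix is reconstructed from the sums above, with rows $x = 0, \dots , 3$ and columns $y = 0,\ 1,\ 2$):

```python
import numpy as np

# Joint PMF P_XY, reconstructed from the sums in the solution:
# rows: x = 0..3, columns: y = 0..2
P_xy = np.array([[1/4, 1/8, 1/8],
                 [0,   0,   1/8],
                 [0,   0,   0  ],
                 [1/4, 1/8, 0  ]])

# Marginal PMF P_X: for each x, sum over all y (axis 1)
P_x = P_xy.sum(axis=1)   # gives [0.5, 0.125, 0.0, 0.375]
```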
 
  
'''(2)'''&nbsp; Analogous to sub-task&nbsp; '''(1)''',&nbsp; the following now holds:
:$$P_Y(Y = y_{\kappa}) = \sum_{x \hspace{0.05cm} \in \hspace{0.05cm} X} \hspace{0.1cm} P_{XY}(x, y_{\kappa}),$$
:$$P_Y(Y = 0) = 1/4+0+0+1/4 = 1/2 \hspace{0.15cm}\underline{= 0.500},$$
:$$P_Y(Y = 1) = 1/8+0+0+1/8 = 1/4  \hspace{0.15cm}\underline{= 0.250},$$
:$$P_Y(Y = 2) = 1/8+1/8+0+0 = 1/4 \hspace{0.15cm}\underline{= 0.250}\hspace{0.5cm} \Rightarrow  \hspace{0.5cm}    P_Y(Y) = \big [ 1/2, \ 1/4 , \ 1/4 \big ].$$

'''(3)'''&nbsp; In the case of statistical independence,&nbsp; $P_{XY}(X,\ Y)= P_X(X) \cdot P_Y(Y)$&nbsp; would have to hold.
*This does not apply here: &nbsp; &nbsp; answer&nbsp; <u>'''NO'''</u>.
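The second marginal and the independence test of sub-tasks (2) and (3) can be sketched the same way: independence holds exactly when the joint PMF equals the outer product of its marginals (joint PMF values again taken from the sums in the solution):

```python
import numpy as np

# Joint PMF P_XY (rows: x = 0..3, columns: y = 0..2), from the solution
P_xy = np.array([[1/4, 1/8, 1/8],
                 [0,   0,   1/8],
                 [0,   0,   0  ],
                 [1/4, 1/8, 0  ]])

P_x = P_xy.sum(axis=1)   # marginal over y
P_y = P_xy.sum(axis=0)   # marginal over x -> [0.5, 0.25, 0.25]

# X and Y are independent iff P_XY equals the outer product P_X * P_Y
independent = np.allclose(P_xy, np.outer(P_x, P_y))
print(independent)   # False: X and Y are statistically dependent
```

Note that the first row already satisfies the product rule (e.g. $P_{XY}(0,0)=1/4=1/2 \cdot 1/2$); the rows $x=1$ and $x=2$ break it, which is why a single deviating entry is enough for dependence.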
 
  
'''(4)'''&nbsp; [[File:P_ID2753__Inf_Z_3_2d_neu.png|right|frame|Different probability functions]]
Starting from the left-hand table &nbsp; &rArr; &nbsp; $P_{XY}(X,\ Y)$,&nbsp; we arrive at the middle table &nbsp; &rArr; &nbsp; $P_{UY}(U,\ Y)$&nbsp; by combining certain probabilities according to&nbsp; $U = X \hspace{0.1cm}\text{mod} \hspace{0.1cm} 2$.

If one also takes into account&nbsp; $V = Y \hspace{0.1cm}\text{mod} \hspace{0.1cm} 2$,&nbsp; one obtains the probabilities sought according to the right-hand table:
:$$P_{UV}( U = 0,\ V = 0) = 1/4+1/8 = 3/8 \hspace{0.15cm}\underline{=  0.375},$$
:$$P_{UV}( U = 0,\ V = 1) = 1/8 \hspace{0.15cm}\underline{= 0.125},$$
:$$P_{UV}( U = 1,\ V = 0) = 1/4+1/8 = 3/8 \hspace{0.15cm}\underline{=  0.375},$$
:$$P_{UV}( U = 1,\ V = 1) = 1/8 \hspace{0.15cm}\underline{=  0.125}.$$
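The table-combining step can be written as a short loop: every entry of the joint PMF is added to the cell $(x \bmod 2,\ y \bmod 2)$ of a $2\times2$ matrix (joint PMF values again taken from the sums in the solution):

```python
import numpy as np

# Joint PMF P_XY (rows: x = 0..3, columns: y = 0..2), from the solution
P_xy = np.array([[1/4, 1/8, 1/8],
                 [0,   0,   1/8],
                 [0,   0,   0  ],
                 [1/4, 1/8, 0  ]])

# Aggregate probabilities according to U = X mod 2, V = Y mod 2
P_uv = np.zeros((2, 2))
for x in range(4):
    for y in range(3):
        P_uv[x % 2, y % 2] += P_xy[x, y]

# Result: rows u = 0, 1 and columns v = 0, 1:
#   P_UV = [[3/8, 1/8],
#           [3/8, 1/8]]
```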
 
  
'''(5)'''&nbsp; The correct answer is&nbsp; <u>'''YES'''</u>:
*The corresponding one-dimensional probability mass functions are:
:$$P_U(U)= \big [ 1/2, \ 1/2 \big ],$$
:$$P_V(V)= \big [ 3/4, \ 1/4 \big ].$$
*Thus&nbsp; $P_{UV}(U,\ V)=P_U(U) \cdot P_V(V)$&nbsp; holds &nbsp; &rArr; &nbsp; $U$&nbsp; and&nbsp; $V$&nbsp; are statistically independent.
{{ML-Fuß}}
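The independence of $U$ and $V$ can be confirmed with the same outer-product test as in sub-task (3), this time applied to the $2\times2$ PMF derived from the joint table:

```python
import numpy as np

# Joint PMF P_XY (rows: x = 0..3, columns: y = 0..2), from the solution
P_xy = np.array([[1/4, 1/8, 1/8],
                 [0,   0,   1/8],
                 [0,   0,   0  ],
                 [1/4, 1/8, 0  ]])

# P_UV via U = X mod 2, V = Y mod 2
P_uv = np.zeros((2, 2))
for x in range(4):
    for y in range(3):
        P_uv[x % 2, y % 2] += P_xy[x, y]

P_u = P_uv.sum(axis=1)   # [1/2, 1/2]
P_v = P_uv.sum(axis=0)   # [3/4, 1/4]

# Independence holds iff P_UV equals the outer product of its marginals
print(np.allclose(P_uv, np.outer(P_u, P_v)))   # True
```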
 
  
  
[[Category:Information Theory: Exercises|^3.1 General Information on 2D Random Variables^]]

Latest revision as of 10:12, 24 September 2021
