Exercise 3.2Z: Two-dimensional Probability Mass Function

[Figure: PMF $P_{XY}$ of the two-dimensional random variable $XY$]

We consider the random variables $X = \{0,\ 1,\ 2,\ 3\}$ and $Y = \{0,\ 1,\ 2\}$, whose joint probability mass function $P_{XY}(X,\ Y)$ is given.

  • From this two-dimensional probability mass function (PMF), the one-dimensional probability mass functions $P_X(X)$ and $P_Y(Y)$ are to be determined.
  • Such a one-dimensional probability mass function is sometimes also called a "marginal probability".


If $P_{XY}(X,\ Y) = P_X(X) \cdot P_Y(Y)$, the two random variables $X$ and $Y$ are statistically independent. Otherwise, there are statistical dependencies between them.

In the second part of the task we consider the random variables $U = \{0,\ 1\}$ and $V = \{0,\ 1\}$, which result from $X$ and $Y$ by modulo-2 operations:

$$U = X \bmod 2, \qquad V = Y \bmod 2.$$





Hints:

  • The exercise belongs to the chapter  Some preliminary remarks on two-dimensional random variables.
  • The same constellation is assumed here as in  Exercise 3.2.
  • There, the random variables $Y = \{0,\ 1,\ 2,\ 3\}$ were considered, but with the addition $\Pr(Y = 3) = 0$.
  • The property $|X| = |Y|$ enforced in this way was advantageous in the previous task for the formal calculation of the expected value.


Questions

1

What is the probability mass function $P_X(X)$?

$P_X(0) \ = \ $

$P_X(1) \ = \ $

$P_X(2) \ = \ $

$P_X(3) \ = \ $

2

What is the probability mass function $P_Y(Y)$?

$P_Y(0) \ = \ $

$P_Y(1) \ = \ $

$P_Y(2) \ = \ $

3

Are the random variables  X  and  Y  statistically independent?

Yes,
No.

4

Determine the probabilities $P_{UV}(U,\ V)$.

$P_{UV}(U=0,\ V=0) \ = \ $

$P_{UV}(U=0,\ V=1) \ = \ $

$P_{UV}(U=1,\ V=0) \ = \ $

$P_{UV}(U=1,\ V=1) \ = \ $

5

Are the random variables  U  and  V  statistically independent?

Yes,
No.


Solution

(1)  One gets from the joint PMF $P_{XY}(X,\ Y)$ to the one-dimensional probability mass function $P_X(X)$ by summing over all $Y$ probabilities:

$$P_X(X = x_\mu) = \sum_{y \, \in \, Y} P_{XY}(x_\mu,\ y).$$
  • One thus obtains the following numerical values:
$$P_X(X = 0) = 1/4 + 1/8 + 1/8 = 1/2 \;\underline{= 0.500},$$
$$P_X(X = 1) = 0 + 0 + 1/8 = 1/8 \;\underline{= 0.125},$$
$$P_X(X = 2) = 0 + 0 + 0 \;\underline{= 0},$$
$$P_X(X = 3) = 1/4 + 1/8 + 0 = 3/8 \;\underline{= 0.375} \quad\Rightarrow\quad P_X(X) = [1/2,\ 1/8,\ 0,\ 3/8].$$
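
This row-wise summation can also be reproduced with a few lines of NumPy. The following is only an illustrative sketch; the joint PMF matrix `P_XY` is filled in with the values used in the sums above (rows correspond to $X = 0, \dots, 3$, columns to $Y = 0, 1, 2$):

```python
import numpy as np

# Joint PMF P_XY(X, Y): rows are x = 0..3, columns are y = 0, 1, 2
# (entries read off from the sums evaluated above)
P_XY = np.array([
    [1/4, 1/8, 1/8],   # x = 0
    [0,   0,   1/8],   # x = 1
    [0,   0,   0  ],   # x = 2
    [1/4, 1/8, 0  ],   # x = 3
])

# Marginal PMF of X: sum each row over all y
P_X = P_XY.sum(axis=1)
print(P_X)   # [0.5   0.125 0.    0.375]  ->  [1/2, 1/8, 0, 3/8]
```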


(2)  Analogously to sub-task (1), the following now holds:

$$P_Y(Y = y_\kappa) = \sum_{x \, \in \, X} P_{XY}(x,\ y_\kappa),$$
$$P_Y(Y = 0) = 1/4 + 0 + 0 + 1/4 = 1/2 \;\underline{= 0.500},$$
$$P_Y(Y = 1) = 1/8 + 0 + 0 + 1/8 = 1/4 \;\underline{= 0.250},$$
$$P_Y(Y = 2) = 1/8 + 1/8 + 0 + 0 = 1/4 \;\underline{= 0.250} \quad\Rightarrow\quad P_Y(Y) = [1/2,\ 1/4,\ 1/4].$$


(3)  In the case of statistical independence, $P_{XY}(X,\ Y) = P_X(X) \cdot P_Y(Y)$ would have to hold.

  • This does not apply here:  answer NO.
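
This check can also be carried out numerically. A minimal sketch, again assuming the joint PMF matrix used above: the joint PMF is compared element-wise with the outer product of its two marginals.

```python
import numpy as np

P_XY = np.array([[1/4, 1/8, 1/8],
                 [0,   0,   1/8],
                 [0,   0,   0  ],
                 [1/4, 1/8, 0  ]])

P_X = P_XY.sum(axis=1)   # marginal over Y  ->  [1/2, 1/8, 0, 3/8]
P_Y = P_XY.sum(axis=0)   # marginal over X  ->  [1/2, 1/4, 1/4]

# Independence would require P_XY(x, y) = P_X(x) * P_Y(y) for every entry
print(np.allclose(P_XY, np.outer(P_X, P_Y)))   # False -> X and Y are dependent
```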


(4)  Starting from the left-hand table  ⇒  $P_{XY}(X,\ Y)$, we arrive at the middle table  ⇒  $P_{UY}(U,\ Y)$ by combining certain probabilities according to $U = X \bmod 2$.

If one also takes $V = Y \bmod 2$ into account, one obtains the probabilities sought according to the right-hand table:

[Figure: Different probability functions]
$$P_{UV}(U = 0,\ V = 0) = 3/8 \;\underline{= 0.375},$$
$$P_{UV}(U = 0,\ V = 1) = 1/8 \;\underline{= 0.125},$$
$$P_{UV}(U = 1,\ V = 0) = 3/8 \;\underline{= 0.375},$$
$$P_{UV}(U = 1,\ V = 1) = 1/8 \;\underline{= 0.125}.$$
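
In code, this combination amounts to accumulating each entry $P_{XY}(x,\ y)$ into cell $(x \bmod 2,\ y \bmod 2)$ of a $2 \times 2$ matrix. A short sketch, again based on the joint PMF matrix assumed above:

```python
import numpy as np

P_XY = np.array([[1/4, 1/8, 1/8],
                 [0,   0,   1/8],
                 [0,   0,   0  ],
                 [1/4, 1/8, 0  ]])

# Accumulate P_XY(x, y) into the cell (x mod 2, y mod 2)
P_UV = np.zeros((2, 2))
for x in range(4):
    for y in range(3):
        P_UV[x % 2, y % 2] += P_XY[x, y]

print(P_UV)   # [[0.375 0.125]
              #  [0.375 0.125]]
```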


(5)  The correct answer is   YES:

  • The corresponding one-dimensional probability mass functions are:  
$$P_U(U) = [1/2,\ 1/2],$$
$$P_V(V) = [3/4,\ 1/4].$$
  • Thus: $P_{UV}(U,\ V) = P_U(U) \cdot P_V(V)$  ⇒  $U$ and $V$ are statistically independent.
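
Numerically, the same check as in sub-task (3) now succeeds. A short sketch using the $P_{UV}$ table from sub-task (4):

```python
import numpy as np

P_UV = np.array([[3/8, 1/8],
                 [3/8, 1/8]])

P_U = P_UV.sum(axis=1)   # [1/2, 1/2]
P_V = P_UV.sum(axis=0)   # [3/4, 1/4]

# P_UV equals the outer product of its marginals -> U and V are independent
print(np.allclose(P_UV, np.outer(P_U, P_V)))   # True
```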