Exercise 3.2Z: Two-dimensional Probability Mass Function
{{quiz-Header|Buchseite=Information_Theory/Some_Preliminary_Remarks_on_Two-Dimensional_Random_Variables
}}
[[File:P_ID2752__Inf_Z_3_2_neu.png|right|frame|PMF of the two-dimensional random variable $XY$]]
We consider the random variables $X = \{0,\ 1,\ 2,\ 3\}$ and $Y = \{0,\ 1,\ 2\}$, whose joint probability mass function $P_{XY}(X,\ Y)$ is given.
*From this two-dimensional probability mass function (PMF), the one-dimensional probability mass functions $P_X(X)$ and $P_Y(Y)$ are to be determined.
*Such a one-dimensional probability mass function is sometimes also called the "marginal probability".

If $P_{XY}(X,\ Y) = P_X(X) \cdot P_Y(Y)$, the two random variables $X$ and $Y$ are statistically independent. Otherwise, there are statistical dependencies between them.

In the second part of the exercise we consider the random variables $U = \{0,\ 1\}$ and $V = \{0,\ 1\}$, which result from $X$ and $Y$ by modulo-2 operations:
:$$U = X \bmod 2, \qquad V = Y \bmod 2.$$
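Beyond the original exercise text, a minimal Python sketch of the marginalization described above may be helpful; the joint values used here are hypothetical placeholders, not those of the figure:

```python
# Minimal sketch: marginalizing a two-dimensional PMF stored as a dictionary.
# The joint values below are hypothetical placeholders, NOT those of the figure.
from collections import defaultdict

P_xy = {(0, 0): 0.25, (0, 1): 0.125, (1, 0): 0.25, (1, 1): 0.375}

def marginal(p_joint, axis):
    """Sum the joint PMF over the other variable (axis 0 -> P_X, axis 1 -> P_Y)."""
    p = defaultdict(float)
    for pair, prob in p_joint.items():
        p[pair[axis]] += prob
    return dict(p)

print(marginal(P_xy, 0))  # {0: 0.375, 1: 0.625}
print(marginal(P_xy, 1))  # {0: 0.5, 1: 0.5}
```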
<u>Hints:</u>
*The exercise belongs to the chapter [[Information_Theory/Einige_Vorbemerkungen_zu_zweidimensionalen_Zufallsgrößen|Some preliminary remarks on two-dimensional random variables]].
*The same constellation is assumed here as in [[Aufgaben:Aufgabe_3.2:_Erwartungswertberechnungen|Exercise 3.2]].
*There the random variables $Y = \{0,\ 1,\ 2,\ 3\}$ were considered, but with the additional condition $\Pr(Y = 3) = 0$.
*The property $|X| = |Y|$ enforced in this way was advantageous in the previous exercise for the formal calculation of the expected values.
===Questions===
<quiz display=simple>
{What is the probability mass function $P_X(X)$?
|type="{}"}
$P_X(0) \ = \ $ { 0.5 3% }
$P_X(1) \ = \ $ { 0.125 3% }
$P_X(2) \ = \ $ { 0 3% }
$P_X(3) \ = \ $ { 0.375 3% }

{What is the probability mass function $P_Y(Y)$?
|type="{}"}
$P_Y(0) \ = \ $ { 0.5 3% }
$P_Y(1) \ = \ $ { 0.25 3% }
$P_Y(2) \ = \ $ { 0.25 3% }

{Are the random variables $X$ and $Y$ statistically independent?
|type="()"}
- Yes,
+ No.

{Determine the probabilities $P_{UV}(U,\ V)$.
|type="{}"}
$P_{UV}(U = 0,\ V = 0) \ = \ $ { 0.375 3% }
$P_{UV}(U = 0,\ V = 1) \ = \ $ { 0.125 3% }
$P_{UV}(U = 1,\ V = 0) \ = \ $ { 0.375 3% }
$P_{UV}(U = 1,\ V = 1) \ = \ $ { 0.125 3% }

{Are the random variables $U$ and $V$ statistically independent?
|type="()"}
+ Yes,
- No.
</quiz>
===Solution===
{{ML-Kopf}}
'''(1)'''&nbsp; You get from $P_{XY}(X,\ Y)$ to the one-dimensional probability mass function $P_X(X)$ by summing over all $Y$ probabilities:
:$$P_X(X = x_{\mu}) = \sum_{y \in Y} P_{XY}(x_{\mu},\ y).$$
*One thus obtains the following numerical values:
:$$P_X(X = 0) = 1/4 + 1/8 + 1/8 = 1/2 = \underline{0.500},$$
:$$P_X(X = 1) = 0 + 0 + 1/8 = 1/8 = \underline{0.125},$$
:$$P_X(X = 2) = 0 + 0 + 0 = \underline{0},$$
:$$P_X(X = 3) = 1/4 + 1/8 + 0 = 3/8 = \underline{0.375} \hspace{0.3cm}\Rightarrow\hspace{0.3cm} P_X(X) = [1/2,\ 1/8,\ 0,\ 3/8].$$
'''(2)'''&nbsp; Analogous to sub-task '''(1)''', the following now holds:
:$$P_Y(Y = y_{\kappa}) = \sum_{x \in X} P_{XY}(x,\ y_{\kappa}),$$
:$$P_Y(Y = 0) = 1/4 + 0 + 0 + 1/4 = 1/2 = \underline{0.500},$$
:$$P_Y(Y = 1) = 1/8 + 0 + 0 + 1/8 = 1/4 = \underline{0.250},$$
:$$P_Y(Y = 2) = 1/8 + 1/8 + 0 + 0 = 1/4 = \underline{0.250} \hspace{0.3cm}\Rightarrow\hspace{0.3cm} P_Y(Y) = [1/2,\ 1/4,\ 1/4].$$
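As a numerical cross-check (not part of the original solution), the following Python sketch reproduces sub-tasks (1) and (2); it assumes the joint PMF entries of the figure are exactly those appearing in the sums above:

```python
# Cross-check of sub-tasks (1) and (2).
# Joint PMF P_XY(x, y); values assumed from the sums above (x = 0..3, y = 0..2).
P_xy = {
    (0, 0): 1/4, (1, 0): 0,   (2, 0): 0, (3, 0): 1/4,
    (0, 1): 1/8, (1, 1): 0,   (2, 1): 0, (3, 1): 1/8,
    (0, 2): 1/8, (1, 2): 1/8, (2, 2): 0, (3, 2): 0,
}

P_x = {x: sum(P_xy[(x, y)] for y in range(3)) for x in range(4)}
P_y = {y: sum(P_xy[(x, y)] for x in range(4)) for y in range(3)}

print(P_x)  # {0: 0.5, 1: 0.125, 2: 0, 3: 0.375}
print(P_y)  # {0: 0.5, 1: 0.25, 2: 0.25}
```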
'''(3)'''&nbsp; In the case of statistical independence, $P_{XY}(X,\ Y) = P_X(X) \cdot P_Y(Y)$ would have to hold.
*This does not apply here &nbsp;⇒&nbsp; Answer <u>'''NO'''</u>. For example, $P_{XY}(X = 1,\ Y = 0) = 0$, whereas $P_X(1) \cdot P_Y(0) = 1/8 \cdot 1/2 = 1/16 \ne 0$.
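The same sketch confirms this, reusing P_xy, P_x and P_y from the block above:

```python
# Independence test for X and Y, reusing P_xy, P_x, P_y from the block above.
from math import isclose

print(all(isclose(P_xy[(x, y)], P_x[x] * P_y[y])
          for x in range(4) for y in range(3)))  # False
```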
'''(4)'''&nbsp; Starting from the left-hand table &nbsp;⇒&nbsp; $P_{XY}(X,\ Y)$, we arrive at the middle table &nbsp;⇒&nbsp; $P_{UY}(U,\ Y)$ by combining certain probabilities according to $U = X \bmod 2$.
[[File:P_ID2753__Inf_Z_3_2d_neu.png|right|frame|Different probability functions]]
If one also takes $V = Y \bmod 2$ into account, one obtains the probabilities sought according to the right-hand table:
:$$P_{UV}(U = 0,\ V = 0) = 3/8 = \underline{0.375},$$
:$$P_{UV}(U = 0,\ V = 1) = 1/8 = \underline{0.125},$$
:$$P_{UV}(U = 1,\ V = 0) = 3/8 = \underline{0.375},$$
:$$P_{UV}(U = 1,\ V = 1) = 1/8 = \underline{0.125}.$$
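These values can be cross-checked by applying the modulo-2 mapping directly to P_xy from the block after sub-task (2):

```python
# Deriving P_UV from P_XY via U = X mod 2, V = Y mod 2 (reusing P_xy).
from collections import defaultdict

P_uv = defaultdict(float)
for (x, y), prob in P_xy.items():
    P_uv[(x % 2, y % 2)] += prob

print(dict(P_uv))
# {(0, 0): 0.375, (1, 0): 0.375, (0, 1): 0.125, (1, 1): 0.125}
```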
'''(5)'''&nbsp; The correct answer is <u>'''YES'''</u>:
*The corresponding one-dimensional probability mass functions are:
:$$P_U(U) = [1/2,\ 1/2],$$
:$$P_V(V) = [3/4,\ 1/4].$$
*Thus $P_{UV}(U,\ V) = P_U(U) \cdot P_V(V)$ holds &nbsp;⇒&nbsp; $U$ and $V$ are statistically independent.
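Numerically, with P_uv from the previous block:

```python
# Verifying the independence of U and V from P_uv.
from math import isclose

P_u = {u: P_uv[(u, 0)] + P_uv[(u, 1)] for u in (0, 1)}  # {0: 0.5, 1: 0.5}
P_v = {v: P_uv[(0, v)] + P_uv[(1, v)] for v in (0, 1)}  # {0: 0.75, 1: 0.25}

print(all(isclose(P_uv[(u, v)], P_u[u] * P_v[v])
          for u in (0, 1) for v in (0, 1)))  # True
```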
{{ML-Fuß}}
[[Category:Information Theory: Exercises|^3.1 General Information on 2D Random Variables^]]