Exercise 3.2Z: Two-dimensional Probability Mass Function
We consider the random variables $X = \{0,\ 1,\ 2,\ 3\}$ and $Y = \{0,\ 1,\ 2\}$, whose joint probability mass function $P_{XY}(X, Y)$ is given.
- From this two-dimensional probability mass function (PMF), the one-dimensional probability mass functions $P_X(X)$ and $P_Y(Y)$ are to be determined.
- Such a one-dimensional probability mass function is sometimes also called a "marginal probability".
If $P_{XY}(X, Y) = P_X(X) \cdot P_Y(Y)$, the two random variables $X$ and $Y$ are statistically independent. Otherwise, there are statistical dependencies between them.
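Both steps can be sketched in a few lines of NumPy. This is only an illustrative sketch; the array name `p_xy` and the numerical entries below are placeholders and not the values given in the exercise:

```python
import numpy as np

# Illustrative joint PMF P_XY as a |X| x |Y| array (rows: x, columns: y).
# The numbers are placeholders, not the table from the exercise.
p_xy = np.array([[0.1, 0.2],
                 [0.3, 0.4]])

p_x = p_xy.sum(axis=1)          # marginal PMF P_X(X): sum over all y
p_y = p_xy.sum(axis=0)          # marginal PMF P_Y(Y): sum over all x

# Statistical independence holds iff P_XY(X,Y) = P_X(X) * P_Y(Y) everywhere,
# i.e. iff the joint PMF equals the outer product of its marginals.
independent = np.allclose(p_xy, np.outer(p_x, p_y))

print(p_x)          # [0.3 0.7]
print(p_y)          # [0.4 0.6]
print(independent)  # False for this placeholder table
```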
In the second part of the task we consider the random variables $U = \{0,\ 1\}$ and $V = \{0,\ 1\}$, which result from $X$ and $Y$ by modulo-2 operations:
- $U = X \bmod 2, \quad V = Y \bmod 2$.
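The construction of $U$ and $V$ amounts to folding the joint table: all entries of $P_{XY}$ whose row and column indices have the same parities are added. A minimal sketch, again with `p_xy` as an arbitrary placeholder array rather than the exercise data:

```python
import numpy as np

# Placeholder joint PMF P_XY with |X| = 4 rows and |Y| = 3 columns.
p_xy = np.full((4, 3), 1.0 / 12)   # uniform dummy values, not the exercise data

# P_UV(u, v) = sum of all P_XY(x, y) with x mod 2 == u and y mod 2 == v.
p_uv = np.zeros((2, 2))
for x in range(p_xy.shape[0]):
    for y in range(p_xy.shape[1]):
        p_uv[x % 2, y % 2] += p_xy[x, y]

print(p_uv)   # for the uniform dummy table: [[1/3, 1/6], [1/3, 1/6]]
```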
Hints:
- The exercise belongs to the chapter Some preliminary remarks on two-dimensional random variables.
- The same constellation is assumed here as in Exercise 3.2.
- There, the random variable $Y = \{0,\ 1,\ 2,\ 3\}$ was considered, but with the additional condition $\Pr(Y = 3) = 0$.
- The property $|X| = |Y|$ enforced in this way was advantageous in the previous exercise for the formal calculation of the expected value.
Questions
Solution
(1) For the marginal probability mass function of $X$, the following holds:
- $P_X(X = x_\mu) = \sum_{y \,\in\, Y} P_{XY}(x_\mu, y)$.
- One thus obtains the following numerical values:
- $P_X(X = 0) = 1/4 + 1/8 + 1/8 = 1/2 \;\; \underline{= 0.500}$,
- $P_X(X = 1) = 0 + 0 + 1/8 = 1/8 \;\; \underline{= 0.125}$,
- $P_X(X = 2) = 0 + 0 + 0 \;\; \underline{= 0}$,
- $P_X(X = 3) = 1/4 + 1/8 + 0 = 3/8 \;\; \underline{= 0.375}$
⇒ $P_X(X) = \big[1/2,\; 1/8,\; 0,\; 3/8\big]$.
(2) Analogous to sub-task (1), the following now holds:
- $P_Y(Y = y_\kappa) = \sum_{x \,\in\, X} P_{XY}(x, y_\kappa)$,
- $P_Y(Y = 0) = 1/4 + 0 + 0 + 1/4 = 1/2 \;\; \underline{= 0.500}$,
- $P_Y(Y = 1) = 1/8 + 0 + 0 + 1/8 = 1/4 \;\; \underline{= 0.250}$,
- $P_Y(Y = 2) = 1/8 + 1/8 + 0 + 0 = 1/4 \;\; \underline{= 0.250}$
⇒ $P_Y(Y) = \big[1/2,\; 1/4,\; 1/4\big]$.
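The two marginal PMFs can be checked numerically. The array below is a reconstruction of the joint PMF $P_{XY}(X, Y)$ as implied by the individual sums in sub-tasks (1) and (2):

```python
import numpy as np

# Joint PMF P_XY(X, Y), rows x = 0..3, columns y = 0..2,
# reconstructed from the sums listed in sub-tasks (1) and (2).
p_xy = np.array([[1/4, 1/8, 1/8],
                 [0,   0,   1/8],
                 [0,   0,   0  ],
                 [1/4, 1/8, 0  ]])

print(p_xy.sum(axis=1))   # P_X(X) = [0.5   0.125 0.    0.375]
print(p_xy.sum(axis=0))   # P_Y(Y) = [0.5  0.25 0.25]
```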
(3) In the case of statistical independence, $P_{XY}(X, Y) = P_X(X) \cdot P_Y(Y)$ would have to hold.
- This is not the case here: for example, $P_{XY}(X = 1, Y = 0) = 0$, whereas $P_X(X = 1) \cdot P_Y(Y = 0) = 1/8 \cdot 1/2 = 1/16 \ne 0$ ⇒ answer NO.
(4) Starting from the left-hand table ⇒ $P_{XY}(X, Y)$, we arrive at the middle table ⇒ $P_{UY}(U, Y)$ by combining certain probabilities according to $U = X \bmod 2$.
If one additionally takes $V = Y \bmod 2$ into account, one obtains the probabilities sought according to the right-hand table:
- $P_{UV}(U = 0, V = 0) = 3/8 \;\; \underline{= 0.375}$,
- $P_{UV}(U = 0, V = 1) = 1/8 \;\; \underline{= 0.125}$,
- $P_{UV}(U = 1, V = 0) = 3/8 \;\; \underline{= 0.375}$,
- $P_{UV}(U = 1, V = 1) = 1/8 \;\; \underline{= 0.125}$.
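The same folding as in the sketch above, applied to the reconstructed joint PMF, confirms these four values:

```python
import numpy as np

# Joint PMF P_XY(X, Y) as reconstructed from sub-tasks (1) and (2).
p_xy = np.array([[1/4, 1/8, 1/8],
                 [0,   0,   1/8],
                 [0,   0,   0  ],
                 [1/4, 1/8, 0  ]])

# Combine probabilities according to U = X mod 2 and V = Y mod 2.
p_uv = np.zeros((2, 2))
for x in range(4):
    for y in range(3):
        p_uv[x % 2, y % 2] += p_xy[x, y]

print(p_uv)   # [[0.375 0.125]
              #  [0.375 0.125]]
```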
(5) The correct answer is YES:
- The corresponding one-dimensional probability mass functions are:
- $P_U(U) = \big[1/2,\; 1/2\big]$,
- $P_V(V) = \big[3/4,\; 1/4\big]$.
- Thus $P_{UV}(U, V) = P_U(U) \cdot P_V(V)$ holds ⇒ $U$ and $V$ are statistically independent.
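As a final check, the independence of $U$ and $V$ can be confirmed by comparing the joint PMF from sub-task (4) with the outer product of its two marginals:

```python
import numpy as np

p_uv = np.array([[3/8, 1/8],
                 [3/8, 1/8]])          # P_UV(U, V) from sub-task (4)

p_u = p_uv.sum(axis=1)                 # P_U(U) = [1/2, 1/2]
p_v = p_uv.sum(axis=0)                 # P_V(V) = [3/4, 1/4]

# Independence: P_UV(U, V) = P_U(U) * P_V(V) for all (u, v).
print(np.allclose(p_uv, np.outer(p_u, p_v)))   # True
```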