Exercise 4.7: Weighted Sum and Difference
From LNTwww
Let the random variables u and v be statistically independent of each other, each with mean m and variance σ².
- Both variables have the same probability density function (PDF) and cumulative distribution function (CDF).
- For now, nothing is known about the shape of these functions.
Now two new random variables x and y are formed according to the following equations:
- x = A⋅u + B⋅v,
- y = A⋅u − B⋅v.
Here, A and B denote (any) constant values.
- For subtasks (1) to (4), let m=0, σ=1, A=1 and B=2.
- In subtask (6), u and v are each uniformly distributed with m=1 and σ=0.5. For the constants, A=B=1.
- For subtask (7), A=B=1 still holds. Here the random variables u and v are each symmetrically two-point distributed on ±1:
- Pr(u=+1) = Pr(u=−1) = Pr(v=+1) = Pr(v=−1) = 0.5.
Note: The exercise belongs to the chapter Linear Combinations of Random Variables.
Solution
(1) Since the random variables u and v are zero mean (m=0), the random variable x is also zero mean:
- m_x = (A+B)⋅m = 0.
- For the variance and standard deviation:
- σ_x² = (A²+B²)⋅σ² = 5;  σ_x = √5 ≈ 2.236.
(2) Since y is built from the same weights A and B as x, it has the same standard deviation: σ_y = σ_x ≈ 2.236.
- Because m=0, the mean m_y = m_x = 0 as well.
- For random variables u and v with nonzero mean, on the other hand, m_y = (A−B)⋅m takes a different value than m_x = (A+B)⋅m.
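The moments from the first two subtasks can be checked numerically. The sketch below assumes numpy and uses Gaussian u and v as one possible choice of PDF; the results depend only on m and σ, not on the shape of the distribution:

```python
import numpy as np

# Monte-Carlo check of m_x = (A+B)·m, m_y = (A-B)·m and
# σ_x = σ_y = sqrt(A²+B²)·σ for A = 1, B = 2.
rng = np.random.default_rng(0)
A, B, n = 1, 2, 1_000_000

# Case m = 0, σ = 1 (subtasks 1 and 2):
u = rng.standard_normal(n)
v = rng.standard_normal(n)
x, y = A * u + B * v, A * u - B * v
print(x.mean(), x.std())   # ≈ 0, ≈ 2.236
print(y.mean(), y.std())   # ≈ 0, ≈ 2.236

# With m ≠ 0 the two means differ: m_x = (A+B)·m, m_y = (A-B)·m.
m = 2.0
x2, y2 = A * (u + m) + B * (v + m), A * (u + m) - B * (v + m)
print(x2.mean(), y2.mean())   # ≈ 6, ≈ -2
```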
(3) In this sample solution we assume the more general case m ≠ 0. Then, for the joint moment:
- m_xy = E[x⋅y] = E[(A⋅u + B⋅v)(A⋅u − B⋅v)].
- According to the general calculation rules for expected values, it follows that
- m_xy = A²⋅E[u²] − B²⋅E[v²] = (A² − B²)⋅(m² + σ²).
- This gives the covariance as
- μ_xy = m_xy − m_x⋅m_y = (A² − B²)⋅(m² + σ²) − (A+B)(A−B)⋅m² = (A² − B²)⋅σ².
- With σ=1, A=1 and B=2 we get μ_xy = −3. This is independent of the mean m of the variables u and v.
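A short numpy simulation can confirm that the covariance comes out to (A²−B²)⋅σ² = −3 regardless of the mean; Gaussian u and v are just one assumed choice of PDF here:

```python
import numpy as np

# Sample covariance of x and y for two different means m;
# the result should be ≈ (A²-B²)·σ² = -3 in both cases.
rng = np.random.default_rng(1)
A, B, sigma, n = 1, 2, 1.0, 1_000_000
covs = []
for m in [0.0, 3.0]:
    u = m + sigma * rng.standard_normal(n)
    v = m + sigma * rng.standard_normal(n)
    x, y = A * u + B * v, A * u - B * v
    covs.append(np.mean(x * y) - x.mean() * y.mean())
print(covs)   # both entries ≈ -3
```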
(4) The correlation coefficient is obtained as
- ρ_xy = μ_xy / (σ_x⋅σ_y) = (A² − B²)⋅σ² / ((A² + B²)⋅σ²)  ⇒  ρ_xy = (1 − (B/A)²) / (1 + (B/A)²).
- With B/A = 2 it follows that ρ_xy = −0.6.
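As a numerical sanity check (assuming numpy, with Gaussian u and v as one admissible PDF), the empirical correlation coefficient matches the predicted value:

```python
import numpy as np

# Empirical correlation coefficient for A = 1, B = 2; the formula
# ρ_xy = (1 - (B/A)²)/(1 + (B/A)²) predicts -0.6.
rng = np.random.default_rng(2)
u = rng.standard_normal(1_000_000)
v = rng.standard_normal(1_000_000)
x, y = u + 2 * v, u - 2 * v
rho = np.corrcoef(x, y)[0, 1]
print(rho)   # ≈ -0.6
```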
(5) Statements 1, 3, and 4 are correct:
- From B=0 follows ρ_xy = 1 ("strict correlation"). It can further be seen that in this case x = A⋅u and y = A⋅u are identical random variables.
- The second statement is not true: for A=1 and B=−2 the result is also ρ_xy = −0.6.
- So the sign of the quotient does not matter, because in the equation calculated in subtask (4) the quotient B/A occurs only squared.
- If B ≫ A, both x and y are determined almost exclusively by the random variable v, and y ≈ −x. This corresponds to the correlation coefficient ρ_xy ≈ −1.
- In contrast, B/A = 1 always yields the correlation coefficient ρ_xy = 0 and thus uncorrelatedness between x and y.
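The limiting cases discussed above can be read off directly by evaluating the formula from subtask (4) for a few quotients q = B/A:

```python
# ρ_xy as a function of the quotient q = B/A, per subtask (4).
def rho_xy(q):
    return (1 - q ** 2) / (1 + q ** 2)

print(rho_xy(0))     # 1.0  -> strict correlation (B = 0)
print(rho_xy(1))     # 0.0  -> uncorrelated (A = B)
print(rho_xy(2))     # -0.6 -> as in subtask (4)
print(rho_xy(-2))    # -0.6 -> sign of B/A does not matter
print(rho_xy(100))   # close to -1 -> B >> A
```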
(6) Both statements are true:
- When A = B, x and y are always uncorrelated (for any PDF of the variables u and v).
- The new random variables x and y are nevertheless statistically dependent on each other.
- For Gaussian random variables, on the other hand, statistical independence follows from uncorrelatedness, and vice versa.
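For the uniformly distributed case of this subtask (m=1, σ=0.5, so a uniform PDF of half-width √3/2), a short numpy simulation illustrates "uncorrelated but dependent": the correlation coefficient vanishes, yet knowing x sharply restricts the possible values of y:

```python
import numpy as np

# u, v uniform with m = 1, σ = 0.5  =>  half-width h with σ = h/sqrt(3).
rng = np.random.default_rng(3)
h = np.sqrt(3) * 0.5
u = rng.uniform(1 - h, 1 + h, 1_000_000)
v = rng.uniform(1 - h, 1 + h, 1_000_000)
x, y = u + v, u - v               # A = B = 1
rho = np.corrcoef(x, y)[0, 1]
print(rho)                        # ≈ 0: uncorrelated

# Dependence check: if x is near its maximum, u and v must both be
# near their maxima, so y = u - v must be near 0.  The conditional
# spread of y is then much smaller than its overall spread.
mask = x > x.max() - 0.2
print(y[mask].std(), y.std())
```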
(7) Here, only statement 1 is true:
- With A = B = 1 the correlation coefficient is ρ_xy = 0. That is: x and y are uncorrelated.
- But it can be seen from the sketched two-dimensional PDF that the condition of statistical independence no longer holds in the present case:
- f_xy(x, y) ≠ f_x(x)⋅f_y(y).
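Because u and v take only the values ±1, this can be verified exactly by enumerating all four outcomes, without any simulation; the sketch below uses exact fractions:

```python
from itertools import product
from fractions import Fraction

# Exact check for u, v ∈ {-1, +1}, each with probability 1/2, A = B = 1.
half = Fraction(1, 2)
joint = {}                          # joint probabilities Pr(x, y)
for u, v in product([-1, 1], repeat=2):
    x, y = u + v, u - v
    joint[(x, y)] = joint.get((x, y), 0) + half * half

px, py = {}, {}                     # marginal probabilities of x and y
for (x, y), p in joint.items():
    px[x] = px.get(x, 0) + p
    py[y] = py.get(y, 0) + p

# Covariance is zero (both means vanish), so x and y are uncorrelated:
cov = sum(p * x * y for (x, y), p in joint.items())
print(cov)                          # 0

# ... but the joint probabilities do not factor, e.g. Pr(x=2, y=2) = 0,
# while Pr(x=2)·Pr(y=2) = 1/4 · 1/4 = 1/16:
print(joint.get((2, 2), 0), px[2] * py[2])
```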