
Exercise 4.7: Weighted Sum and Difference

From LNTwww
Revision as of 18:33, 25 February 2022 by Guenter (talk | contribs)

Sum and difference of random variables

Let the random variables  u  and  v  be statistically independent of each other, each with mean  m  and variance  σ².

  • Both variables have the same probability density function  (PDF)  and cumulative distribution function  (CDF).
  • For now,  nothing is known about the shape of these functions.


Now two new random variables  x  and  y  are formed according to the following equations:

x = A·u + B·v,
y = A·u − B·v.

Here,  A  and  B  denote  (any)  constant values.

  • For the subtasks  (1)  to  (4)  let   m=0,   σ=1,   A=1  and  B=2.
  • In subtask  (6),  u  and  v  are each Gaussian distributed with  m=1  and  σ=0.5.  For the constants,  A=B=1.
  • For subtask  (7),  A=B=1  still holds.  Here the random variables  u  and  v  are symmetrically two-point distributed on  ±1:
Pr(u=+1) = Pr(u=−1) = Pr(v=+1) = Pr(v=−1) = 0.5.



Note:  The exercise belongs to the chapter  Linear Combinations of Random Variables.



Questions

1

What is the mean and the standard deviation of  x  for  A=1  and  B=2?

mx = 

σx = 

2

What is the mean and the standard deviation of  y  for  A=1  and  B=2?

my = 

σy = 

3

Calculate the covariance  μxy.  What value results for  A=1  and  B=2?

μxy = 

4

Calculate the correlation coefficient  ρxy  as a function of the quotient  B/A.  What coefficient results for  A=1  and  B=2?

ρxy = 

5

Which of the following statements is always true?

For  B=0  the random variables  x  and  y  are strictly correlated.
It holds  ρxy(−B/A) = −ρxy(B/A).
In the limiting case  B/A → ∞  the random variables  x  and  y  are strictly correlated.
For  A=B  the random variables x  and  y  are uncorrelated.

6

Which statements are true if  A=B=1  holds and  u  and  v  are each Gaussian distributed with mean  m=1  and standard deviation  σ=0.5?

The random variables x  and  y  are uncorrelated.
The random variables x  and  y  are statistically independent.

7

Which statements are true if  u  and  v  are symmetrically two-point distributed and  A=B=1  holds?

The random variables x  and  y  are uncorrelated.
The random variables x  and  y  are statistically independent.


Solution

(1)  Since the random variables  u  and  v  are zero mean  (m=0),  the random variable  x  is also zero mean:

mx = (A+B)·m = 0.
  • For the variance and standard deviation:
σx² = (A²+B²)·σ² = 5;   σx = √5 ≈ 2.236.
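The moments derived above can be checked numerically. The following Python sketch is not part of the original exercise; the Gaussian choice for u and v is just one admissible PDF with m=0 and σ=1, and the variable names are ours:

```python
import random
import statistics

random.seed(1)
A, B = 1, 2
N = 200_000

# Any PDF with m = 0 and sigma = 1 is admissible; a Gaussian is used here
# only as a convenient example.
u = [random.gauss(0, 1) for _ in range(N)]
v = [random.gauss(0, 1) for _ in range(N)]
x = [A * ui + B * vi for ui, vi in zip(u, v)]

mean_x = statistics.fmean(x)      # should be close to (A+B)*m = 0
var_x = statistics.pvariance(x)   # should be close to (A^2+B^2)*sigma^2 = 5
print(mean_x, var_x)
```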


(2)  Since  u  and  v  have the same standard deviation,  σy = σx = √5 ≈ 2.236.

  • Because  m=0,  also  my = mx = 0.
  • For random variables  u  and  v  with non-zero mean,  on the other hand,  my = (A−B)·m  generally differs from  mx = (A+B)·m.


(3)  In this sample solution we assume the more general case  m ≠ 0.  Then the joint moment is:

mxy = E[x·y] = E[(A·u + B·v)·(A·u − B·v)].
  • According to the general calculation rules for expected values,  the cross terms  ±A·B·E[u·v]  cancel,  and it follows:
mxy = A²·E[u²] − B²·E[v²] = (A²−B²)·(m²+σ²).
  • This gives the covariance:
μxy = mxy − mx·my = (A²−B²)·(m²+σ²) − (A+B)·(A−B)·m² = (A²−B²)·σ².
  • With  σ=1,  A=1  and  B=2  we get  μxy = −3.  This is independent of the mean  m  of the variables  u  and  v.
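The independence from m can be confirmed by evaluating the derived formulas directly; a small Python sketch (function name ours, not part of the original exercise):

```python
def covariance_xy(A, B, m, sigma):
    """Covariance of x = A*u + B*v and y = A*u - B*v per the derivation above."""
    m_xy = (A**2 - B**2) * (m**2 + sigma**2)  # joint moment E[x*y]
    m_x = (A + B) * m                         # mean of x
    m_y = (A - B) * m                         # mean of y
    return m_xy - m_x * m_y

# The same value results for every mean m, as claimed:
results = [covariance_xy(A=1, B=2, m=m, sigma=1) for m in (0, 1, 5)]
print(results)  # → [-3, -3, -3]
```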


[Figure:  Correlation coefficient as a function of the quotient  B/A]

(4)  The correlation coefficient is obtained as

ρxy = μxy/(σx·σy) = (A²−B²)·σ² / [(A²+B²)·σ²]   ⇒   ρxy = (1−(B/A)²)/(1+(B/A)²).
  • With  B/A=2  it follows  ρxy = −0.6.
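The dependence on the quotient q = B/A can be tabulated directly with a short sketch (function name ours, not part of the original exercise):

```python
def rho_xy(q):
    """Correlation coefficient as a function of q = B/A, per the formula above."""
    return (1 - q**2) / (1 + q**2)

print(rho_xy(2.0))   # B/A = 2   -> -0.6
print(rho_xy(0.0))   # B = 0     -> +1  (strict correlation)
print(rho_xy(1.0))   # A = B     ->  0  (uncorrelated)
print(rho_xy(1e6))   # B/A >> 1  -> close to -1
```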


(5)  Correct are statements 1, 3, and 4:

  • From  B=0  follows  ρxy=+1  ("strict correlation").  It can further be seen that in this case  x=u  and  y=u  are identical random variables.
  • The second statement is not true:  For  A=1  and  B=−2  the result is also  ρxy=−0.6.
  • So the sign of the quotient does not matter,  because in the equation calculated in subtask  (4)  the quotient  B/A  occurs only quadratically.
  • If  B ≫ A,  both  x  and  y  are determined almost exclusively by the random variable  v,  and  y ≈ −x.  This corresponds to the correlation coefficient  ρxy → −1.
  • In contrast,  B/A = ±1  always yields the correlation coefficient  ρxy=0  and thus uncorrelatedness between  x  and  y.


(6)  Both statements are true:

  • When  A=B,  the random variables  x  and  y  are always uncorrelated  (for any PDF of the variables  u  and  v).
  • Since  u  and  v  are Gaussian,  the new random variables  x  and  y  are also Gaussian distributed  (a linear combination of Gaussian random variables is again Gaussian).
  • For Gaussian random variables,  however,  statistical independence follows from uncorrelatedness,  and vice versa.


[Figure:  Joint PDF and marginal PDFs]

(7)  Here,  only statement 1 is true:

  • With  A=B=1,  the correlation coefficient is  ρxy=0.  That is:  x  and  y  are uncorrelated.
  • But the sketched two-dimensional PDF shows that the condition of statistical independence does not hold in the present case:
fxy(x,y) ≠ fx(x)·fy(y).
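Because u and v take only the values ±1, both claims can be checked by exhaustive enumeration. The following Python sketch (not part of the original exercise) runs through the four equally likely (u, v) combinations:

```python
from fractions import Fraction
from itertools import product

p = Fraction(1, 4)                       # each (u, v) pair has probability 1/4
pairs = list(product((+1, -1), repeat=2))

def expect(f):
    """Expected value of f(u, v) over the four equally likely pairs."""
    return sum(p * f(u, v) for u, v in pairs)

# Covariance of x = u + v and y = u - v: zero, so x and y are uncorrelated.
cov = (expect(lambda u, v: (u + v) * (u - v))
       - expect(lambda u, v: u + v) * expect(lambda u, v: u - v))
print(cov)  # → 0

# Independence fails: Pr(x=0) = Pr(y=0) = 1/2, but x=0 and y=0 together
# would require u = v = 0, which is impossible here.
pr_x0 = sum(p for u, v in pairs if u + v == 0)
pr_y0 = sum(p for u, v in pairs if u - v == 0)
pr_both = sum(p for u, v in pairs if u + v == 0 and u - v == 0)
print(pr_x0, pr_y0, pr_both)  # 1/2, 1/2, 0  -> 0 != 1/4
```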