Exercise 4.7: Weighted Sum and Difference

From LNTwww
 
{{quiz-Header|Buchseite=Theory_of_Stochastic_Signals/Linear_Combinations_of_Random_Variables
}}

[[File:P_ID400__Sto_A_4_7.png|right|frame|Sum and difference of random variables]]
Let the random variables&nbsp; $u$&nbsp; and&nbsp; $v$&nbsp; be statistically independent of each other, each with mean&nbsp; $m$&nbsp; and variance&nbsp; $\sigma^2$.  
*Both variables have the same probability density function&nbsp; $\rm (PDF)$&nbsp; and cumulative distribution function&nbsp; $\rm (CDF)$.
*Nothing is known yet about the shape of these functions.
  

Now two new random variables&nbsp; $x$&nbsp; and&nbsp; $y$&nbsp; are formed according to the following equations:

:$$x = A \cdot u + B \cdot v,$$
:$$y= A \cdot u - B \cdot v.$$
  
Here,&nbsp; $A$&nbsp; and&nbsp; $B$&nbsp; denote arbitrary constant values.  
*For subtasks&nbsp; '''(1)'''&nbsp; to&nbsp; '''(4)'''&nbsp; let &nbsp; $m= 0$, &nbsp; $\sigma = 1$, &nbsp; $A = 1$&nbsp; and&nbsp; $B = 2$.
*In subtask&nbsp; '''(6)'''&nbsp; $u$&nbsp; and&nbsp; $v$&nbsp; are each Gaussian distributed with&nbsp; $m= 1$&nbsp; and&nbsp; $\sigma = 0.5$.&nbsp; For the constants,&nbsp; $A = B = 1$&nbsp; holds.
*For subtask&nbsp; '''(7)'''&nbsp; $A = B = 1$&nbsp; still holds.&nbsp; Here the random variables&nbsp; $u$&nbsp; and&nbsp; $v$&nbsp; are symmetrically two-point distributed on&nbsp; $\pm 1$:
:$${\rm Pr}(u=+1) = {\rm Pr}(u=-1) = {\rm Pr}(v=+1) = {\rm Pr}(v=-1) =0.5.$$
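The construction above is easy to check numerically. The following sketch (an illustration, not part of the exercise) picks one admissible distribution for $u$ and $v$, namely Gaussian with $m = 0$ and $\sigma = 1$ as in subtasks '''(1)''' to '''(4)''', forms $x$ and $y$ with $A = 1$, $B = 2$, and estimates their means and standard deviations:

```python
import random

# Illustrative sketch: one admissible choice for the (unknown) PDF is
# Gaussian with m = 0, sigma = 1; then x = A*u + B*v, y = A*u - B*v.
random.seed(1)
A, B, N = 1, 2, 200_000
u = [random.gauss(0, 1) for _ in range(N)]
v = [random.gauss(0, 1) for _ in range(N)]
x = [A * ui + B * vi for ui, vi in zip(u, v)]
y = [A * ui - B * vi for ui, vi in zip(u, v)]

def mean_std(samples):
    m = sum(samples) / len(samples)
    return m, (sum((s - m) ** 2 for s in samples) / len(samples)) ** 0.5

mx, sx = mean_std(x)
my, sy = mean_std(y)
print(round(sx, 2), round(sy, 2))   # both close to sqrt(5) = 2.236
```

The estimated means come out near $0$ and both standard deviations near $\sqrt{5}$, matching the results of subtasks '''(1)''' and '''(2)'''.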


Note:&nbsp; The exercise belongs to the chapter&nbsp; [[Theory_of_Stochastic_Signals/Linear_Combinations_of_Random_Variables|Linear Combinations of Random Variables]].
  
===Questions===
  
 
<quiz display=simple>
 
{What are the mean and the standard deviation of &nbsp;$x$&nbsp; for&nbsp; $A = 1$&nbsp; and&nbsp; $B = 2$?
 
|type="{}"}
 
$m_x \ = \ $ { 0. }
$\sigma_x \ = \ $ { 2.236 3% }
  
  
{What are the mean and the standard deviation of &nbsp;$y$&nbsp; for&nbsp; $A = 1$&nbsp; and&nbsp; $B = 2$?
 
|type="{}"}
 
$m_y \ = \ $ { 0. }
$\sigma_y \ = \ $ { 2.236 3% }
  
  
{Calculate the covariance&nbsp; $\mu_{xy}$.&nbsp; What value results for&nbsp; $A = 1$&nbsp; and&nbsp; $B = 2$?
 
|type="{}"}
 
$\mu_{xy} \ = \ $ { -3.09--2.91 }
  
  
{Calculate the correlation coefficient&nbsp; $\rho_{xy}$&nbsp; as a function of the quotient&nbsp; $B/A$.&nbsp; What coefficient results for&nbsp; $A = 1$&nbsp; and&nbsp; $B = 2$?
 
|type="{}"}
 
$\rho_{xy}\ = \ $ { -0.618--0.582 }
  
  
{Which of the following statements are always true?
 
|type="[]"}
 
+ For &nbsp;$B = 0$&nbsp; the random variables &nbsp;$x$&nbsp; and &nbsp;$y$&nbsp; are strictly correlated.
- It holds &nbsp;$\rho_{xy}(-B/A) = -\rho_{xy}(B/A)$.
+ In the limiting case&nbsp; $B/A \to \infty$&nbsp; the random variables &nbsp;$x$&nbsp; and &nbsp;$y$&nbsp; are strictly correlated.
+ For&nbsp; $A =B$&nbsp; the random variables&nbsp;$x$&nbsp; and &nbsp;$y$&nbsp; are uncorrelated.
  
  
{Which statements are true if&nbsp; $A =B = 1$&nbsp; holds and &nbsp;$u$&nbsp; and &nbsp;$v$&nbsp; are each Gaussian distributed with mean &nbsp;$m = 1$&nbsp; and standard deviation &nbsp;$\sigma = 0.5$?
 
|type="[]"}
 
+ The random variables&nbsp;$x$&nbsp; and &nbsp;$y$&nbsp; are uncorrelated.
+ The random variables&nbsp;$x$&nbsp; and &nbsp;$y$&nbsp; are statistically independent.
  
  
{Which statements are true if &nbsp;$u$&nbsp; and &nbsp;$v$&nbsp; are symmetrically two-point distributed and&nbsp; $A =B = 1$&nbsp; holds?
 
|type="[]"}
 
+ The random variables&nbsp;$x$&nbsp; and &nbsp;$y$&nbsp; are uncorrelated.
- The random variables&nbsp;$x$&nbsp; and &nbsp;$y$&nbsp; are statistically independent.
  
  
 
</quiz>
 
  
===Solution===
 
{{ML-Kopf}}
 
'''(1)'''&nbsp; Since the random variables&nbsp; $u$&nbsp; and&nbsp; $v$&nbsp; are zero mean&nbsp; $(m = 0)$,&nbsp; the random variable&nbsp; $x$&nbsp; is also zero mean:
:$$m_x = (A +B) \cdot m \hspace{0.15cm}\underline{ =0}.$$
*For the variance and standard deviation:
:$$\sigma_x^2 = (A^2 +B^2) \cdot \sigma^2 = 5; \hspace{0.5cm} \sigma_x = \sqrt{5}\hspace{0.15cm}\underline{ \approx 2.236}.$$
 
'''(2)'''&nbsp; Since&nbsp; $u$&nbsp; and&nbsp; $v$&nbsp; have the same standard deviation,&nbsp; $\sigma_y =\sigma_x \hspace{0.15cm}\underline{ \approx 2.236}$&nbsp; also holds.
*Because&nbsp; $m=0$,&nbsp; also&nbsp; $m_y = m_x \hspace{0.15cm}\underline{ =0}$.
*For random variables&nbsp; $u$&nbsp; and&nbsp; $v$&nbsp; with nonzero mean,&nbsp; on the other hand,&nbsp; $m_y = (A -B) \cdot m$&nbsp; would take a different value than&nbsp; $m_x = (A +B) \cdot m$.


'''(3)'''&nbsp; Here we assume the more general case&nbsp; $m \ne 0$.&nbsp; Then the joint moment is:
:$$m_{xy} = {\rm E} \big[x \cdot y \big] = {\rm E} \big[(A \cdot u + B \cdot v) (A \cdot u - B \cdot v)\big] . $$
*According to the general calculation rules for expected values,&nbsp; it follows that:
:$$m_{xy} = A^2 \cdot {\rm E} \big[u^2 \big] - B^2 \cdot {\rm E} \big[v^2 \big] = (A^2 - B^2)(m^2 + \sigma^2).$$
*This gives the covariance:
:$$\mu_{xy} = m_{xy} - m_{x} \cdot m_{y}= (A^2 - B^2)(m^2 + \sigma^2) - (A + B)(A-B) \cdot m^2 = (A^2 - B^2) \cdot \sigma^2.$$
*With&nbsp; $\sigma = 1$,&nbsp; $A = 1$&nbsp; and&nbsp; $B = 2$&nbsp; we get&nbsp; $\mu_{xy}  \hspace{0.15cm}\underline{ =-3}$.&nbsp; This is independent of the mean&nbsp; $m$&nbsp; of the variables&nbsp; $u$&nbsp; and&nbsp; $v$.
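The claim that $\mu_{xy}$ does not depend on $m$ can be cross-checked numerically (a sketch for illustration only, not part of the sample solution): estimating the covariance for several means $m$ gives a value near $-3$ in every case.

```python
import random

# Monte Carlo check of mu_xy = (A^2 - B^2) * sigma^2 with sigma = 1,
# A = 1, B = 2: the estimated covariance stays near -3 for every mean m.
random.seed(7)
A, B, N = 1, 2, 200_000
for m in (0.0, 1.0, -2.5):
    u = [random.gauss(m, 1) for _ in range(N)]
    v = [random.gauss(m, 1) for _ in range(N)]
    x = [A * ui + B * vi for ui, vi in zip(u, v)]
    y = [A * ui - B * vi for ui, vi in zip(u, v)]
    mx, my = sum(x) / N, sum(y) / N
    mu_xy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / N
    print(m, round(mu_xy, 1))   # close to -3 for each m
```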


[[File:P_ID403__Sto_A_4_7_d_neu.png|right|frame|Correlation coefficient as a function of the quotient&nbsp; $B/A$]]
'''(4)'''&nbsp; The correlation coefficient is obtained as
:$$\rho_{xy} =\frac{\mu_{xy}}{\sigma_x \cdot \sigma_y} = \frac{(A^2 - B^2) \cdot \sigma^2}{(A^2 +B^2) \cdot \sigma^2}
\hspace{0.5 cm}\Rightarrow \hspace{0.5 cm}\rho_{xy} =\frac{1 - (B/A)^2} {1 +(B/A)^2}.$$
*With&nbsp; $B/A = 2$&nbsp; it follows that&nbsp; $\rho_{xy}  \hspace{0.15cm}\underline{ =-0.6}$.
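The closed-form result can be captured in a small helper function (an illustration only; the name `rho_xy` is made up here). It reproduces the value $-0.6$ for $B/A = 2$ and also the special cases discussed in the next subtask:

```python
def rho_xy(q: float) -> float:
    # Correlation coefficient as a function of q = B/A (equation above).
    return (1 - q ** 2) / (1 + q ** 2)

print(rho_xy(2.0))    # -0.6  (the result for A = 1, B = 2)
print(rho_xy(-2.0))   # -0.6  (the sign of B/A does not matter)
print(rho_xy(1.0))    # 0.0   (A = B: uncorrelated)
print(rho_xy(0.0))    # 1.0   (B = 0: strict correlation)
```

For $q \to \infty$ the function approaches $-1$, in line with the limiting case $B/A \to \infty$.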
'''(5)'''&nbsp; Correct are <u>statements 1, 3, and 4</u>:
*From&nbsp; $B= 0$&nbsp; follows&nbsp; $\rho_{xy} = 1$&nbsp; ("strict correlation").&nbsp; In this case&nbsp; $x = A \cdot u$&nbsp; and&nbsp; $y = A \cdot u$&nbsp; are identical random variables.
*The second statement is not true: &nbsp; for&nbsp; $A = 1$&nbsp; and&nbsp; $B= -2$&nbsp; the result is also&nbsp; $\rho_{xy} = -0.6$.
*So the sign of the quotient does not matter,&nbsp; because in the equation derived in subtask&nbsp; '''(4)'''&nbsp; the quotient&nbsp; $B/A$&nbsp; occurs only quadratically.
*If&nbsp; $B \gg A$,&nbsp; both&nbsp; $x$&nbsp; and&nbsp; $y$&nbsp; are determined almost exclusively by the random variable&nbsp; $v$,&nbsp; and&nbsp; $ y \approx -x$&nbsp; holds.&nbsp; This corresponds to the correlation coefficient&nbsp; $\rho_{xy} \approx -1$.  
*In contrast,&nbsp; $B/A = 1$&nbsp; always yields the correlation coefficient&nbsp; $\rho_{xy} = 0$&nbsp; and thus uncorrelatedness between&nbsp; $x$&nbsp; and&nbsp; $y$.
  
'''(6)'''&nbsp; <u>Both statements</u> are true:
*When&nbsp; $A=B$,&nbsp; the random variables&nbsp; $x$&nbsp; and&nbsp; $y$&nbsp; are always uncorrelated&nbsp; $($for any PDF of the variables&nbsp; $u$&nbsp; and&nbsp; $v)$.  
*Here the new random variables&nbsp; $x$&nbsp; and&nbsp; $y$&nbsp; are also Gaussian distributed.
*For Gaussian random variables,&nbsp; however,&nbsp; statistical independence follows from uncorrelatedness,&nbsp; and vice versa.  
  
[[File:P_ID404__Sto_A_4_7_g.png|right|frame|Joint PDF and marginal PDFs]]
'''(7)'''&nbsp; Here,&nbsp; only <u>statement 1</u> is true:
*With&nbsp; $A=B= 1$,&nbsp; the correlation coefficient is again&nbsp; $\rho_{xy} = 0$.&nbsp; That is: &nbsp; $x$&nbsp; and&nbsp; $y$&nbsp; are uncorrelated.
*But the sketched two-dimensional PDF shows that the condition of statistical independence no longer holds in the present case:
:$$f_{xy}(x, y) \ne f_{x}(x) \cdot f_{y}(y).$$
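Because the two-point case is discrete, this can even be checked exactly by enumerating the four equally likely combinations of $(u, v)$ (an illustrative sketch, not part of the sample solution):

```python
from fractions import Fraction
from itertools import product

# Enumerate (u, v) in {+1, -1}^2, each combination with probability 1/4,
# and form x = u + v, y = u - v (A = B = 1).
p = Fraction(1, 4)
joint = {}                                   # (x, y) -> probability
for u, v in product((+1, -1), repeat=2):
    x, y = u + v, u - v
    joint[(x, y)] = joint.get((x, y), 0) + p

fx, fy = {}, {}                              # marginal distributions
for (x, y), pr in joint.items():
    fx[x] = fx.get(x, 0) + pr
    fy[y] = fy.get(y, 0) + pr

mu_xy = sum(pr * x * y for (x, y), pr in joint.items())   # means are zero
print(mu_xy)                    # 0 -> x and y are uncorrelated
print(joint.get((2, 2), 0))     # 0
print(fx[2] * fy[2])            # 1/16 -> joint PDF != product of marginals
```

The covariance is exactly zero, yet for example ${\rm Pr}(x=2, y=2) = 0$ while ${\rm Pr}(x=2) \cdot {\rm Pr}(y=2) = 1/16$, so $x$ and $y$ are not statistically independent.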
  
{{ML-Fuß}}
 
  
  
  
[[Category:Theory of Stochastic Signals: Exercises|^4.3 Linear Combinations^]]

Latest revision as of 17:33, 25 February 2022
