Exercise 4.15Z: Statements of the Covariance Matrix
{{quiz-Header|Buchseite=Generalization to N-Dimensional Random Variables
}}

[[File:P_ID664__Sto_Z_4_15.png|right|frame|Are the random signals correlated?]]
Given are the two Gaussian random variables $u$ and $v$, which are statistically independent of each other, each with zero mean and variance $\sigma^2 = 1$.

From these, three new random variables are formed by linear combination:
:$$x_1 = A_1 \cdot u + B_1 \cdot v,$$
:$$x_2 = A_2 \cdot u + B_2 \cdot v,$$
:$$x_3 = A_3 \cdot u + B_3 \cdot v.$$
It is assumed that in all cases considered $(i = 1,\ 2,\ 3)$ the following holds:
:$$A_i^2 + B_i^2 =1.$$
The graph shows the signals $x_1(t)$, $x_2(t)$ and $x_3(t)$ for the case considered in subtask '''(3)''':
* $A_1 = B_2 = 1$,
* $A_2 = B_1 = 0$,
* $A_3 = 0.8, \ B_3 = 0.6$.
The correlation coefficient $\rho_{ij}$ between the random variables $x_i$ and $x_j$ is given as follows:
:$$\rho_{ij} = \frac{A_i \cdot A_j + B_i \cdot B_j}{\sqrt{(A_i^2 + B_i^2)(A_j^2 + B_j^2)}} = A_i \cdot A_j + B_i \cdot B_j.$$
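For reference (this short derivation is an added remark, not part of the original text): since $u$ and $v$ are zero mean, statistically independent and of unit variance, the covariance of $x_i$ and $x_j$ is
:$$E\big[x_i \cdot x_j\big] = A_i \cdot A_j \cdot E\big[u^2\big] + (A_i \cdot B_j + A_j \cdot B_i)\cdot E\big[u \cdot v\big] + B_i \cdot B_j \cdot E\big[v^2\big] = A_i \cdot A_j + B_i \cdot B_j,$$
while the denominator equals $\sigma_i \cdot \sigma_j = 1$ because of $A_i^2 + B_i^2 = 1$.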
Under the assumption $\sigma_1^2 = \sigma_2^2 = \sigma_3^2 = 1$, which is implicit here, the covariance matrix $\mathbf{K}$ is:
:$${\mathbf{K}} =\left[ K_{ij} \right] = \left[ \begin{array}{ccc}
1 & \rho_{12} & \rho_{13} \\
\rho_{12} & 1 & \rho_{23} \\
\rho_{13} & \rho_{23} & 1
\end{array} \right] .$$
This is identical to the correlation matrix $\mathbf{R}$ for zero mean random variables.
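The following minimal NumPy sketch is not part of the original exercise; it merely assembles $\mathbf{K}$ from given coefficients according to the above formula, here for the parameter set of subtask '''(3)''' (variable names chosen freely):
<pre>
import numpy as np

A = np.array([1.0, 0.0, 0.8])   # A_1, A_2, A_3
B = np.array([0.0, 1.0, 0.6])   # B_1, B_2, B_3

# rho_ij = A_i*A_j + B_i*B_j for all pairs at once;
# the diagonal automatically equals A_i^2 + B_i^2 = 1.
K = np.outer(A, A) + np.outer(B, B)
print(K)   # [[1.  0.  0.8]
           #  [0.  1.  0.6]
           #  [0.8 0.6 1. ]]
</pre>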
Hints:
*The exercise belongs to the chapter [[Theory_of_Stochastic_Signals/Generalization_to_N-Dimensional_Random_Variables|Generalization to N-Dimensional Random Variables]].
*Some basics on the application of vectors and matrices can be found on the pages [[Theory_of_Stochastic_Signals/Generalization_to_N-Dimensional_Random_Variables#Basics_of_matrix_operations:_Determinant_of_a_matrix|Determinant of a Matrix]] and [[Theory_of_Stochastic_Signals/Generalization_to_N-Dimensional_Random_Variables#Basics_of_matrix_operations:_Inverse_of_a_matrix|Inverse of a Matrix]].
− | |||
− | |||
− | * | ||
− | * | ||
===Questions===
<quiz display=simple>
{Which of the following statements are true? Give reasons for your findings.
|type="[]"}
- With a suitable choice of $A_1$, ... , $B_3$, the matrix $\mathbf{K}$ can be a diagonal matrix. In other words: $\rho_{12} = \rho_{13} = \rho_{23} = 0$ is possible.
+ With an appropriate choice of the parameters $A_1$, ... , $B_3$, exactly one of the correlation coefficients can be $\rho_{ij} = 0$.
- With an appropriate choice of the parameters $A_1$, ... , $B_3$, exactly two of the correlation coefficients can be $\rho_{ij} = 0$.
+ With an appropriate choice of the parameters $A_1$, ... , $B_3$, all three correlation coefficients can satisfy $\rho_{ij} \ne 0$.
{What are the matrix elements of $\mathbf{K}$ with $A_1 = A_2 = - A_3$ and $B_1 = B_2 = - B_3$?
|type="{}"}
$\rho_{12} \ = \ $ { 1 3% }
{Calculate the coefficients $\rho_{ij}$ for the case shown in the graph: $A_1 = 1$, $B_1 = 0$, $A_2 = 0$, $B_2 = 1$, $A_3 = 0.8$, $B_3 = 0.6$.
|type="{}"}
$\rho_{12} \ = \ $ { 0. }
$\rho_{13} \ = \ $ { 0.8 3% }
$\rho_{23} \ = \ $ { 0.6 3% }
</quiz>
===Solution===
{{ML-Kopf}}
'''(1)''' Only the <u>second and the last statement</u> are true:
*Statement 2 describes the case considered in the graph, where two quantities $($here: $x_1$ and $x_2)$ are uncorrelated, while $x_3$ exhibits statistical bindings with $x_1$ $($via the random variable $u)$ as well as with $x_2$ $($via the random variable $v)$.
*On the other hand, the combination $\rho_{12} = \rho_{13} = \rho_{23} = 0$ is not possible with the structure given here. For this, one would need a third statistically independent random variable $w$, and, for example, $x_1 = k_1 \cdot u$, $x_2 = k_2 \cdot v$ and $x_3 = k_3 \cdot w$ would have to hold (see also the geometric remark after this list).
*The third statement is not true: If $x_1$ and $x_2$ are uncorrelated and at the same time also $x_1$ and $x_3$, then no statistical bindings can exist between $x_2$ and $x_3$.
*In general, however, both $\rho_{12}$ and $\rho_{13}$ will be different from zero. A very simple example of this is considered in subtask '''(2)'''.
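A geometric remark (added here for illustration; it is not part of the original solution): because of $A_i^2 + B_i^2 = 1$, each pair $(A_i,\ B_i)$ can be viewed as a unit vector in the plane, and $\rho_{ij} = A_i \cdot A_j + B_i \cdot B_j$ is the dot product of two such vectors. A diagonal matrix $\mathbf{K}$ would therefore require three mutually orthogonal unit vectors in only two dimensions, which is impossible; a third dimension, corresponding to the third independent random variable $w$ mentioned above, would be needed.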
− | |||
'''(2)''' In this case the random variables $x_1$ and $x_2$ are identical and therefore completely $($to $100\%)$ correlated.
*With $A_2 = A_1$ and $B_2 = B_1$ we obtain for the joint correlation coefficient:
:$$\rho_{12} = A_1 \cdot A_2 + B_1 \cdot B_2 = A_1^2 + B_1^2 \hspace{0.15cm}\underline{=1}.$$
*In the same way, with $A_3 = -A_1$ and $B_3 = -B_1$:
:$$\rho_{13} = A_1 \cdot A_3 + B_1 \cdot B_3 = -(A_1^2 + B_1^2) \hspace{0.15cm}\underline{=-1 \hspace{0.1cm}(= \rho_{23})}.$$
'''(3)''' With this parameter set, $x_1$ is identical to the random variable $u$, while $x_2 = v$ holds.
*Since $u$ and $v$ are statistically independent of each other, we get $\rho_{12} \hspace{0.15cm}\underline{ = 0}.$
*In contrast, for the other two correlation coefficients:
:$$\rho_{13} = A_1 \cdot A_3 + B_1 \cdot B_3 = 1 \cdot 0.8 + 0 \cdot 0.6 \hspace{0.15cm}\underline{ = 0.8},$$
:$$\rho_{23} = A_2 \cdot A_3 + B_2 \cdot B_3 = 0 \cdot 0.8 + 1 \cdot 0.6 \hspace{0.15cm}\underline{ = 0.6}.$$
*To a (very well) trained eye, the graph on the information page shows that the signal $x_3(t)$ has more similarity with $x_1(t)$ than with $x_2(t)$.
*This fact is also expressed by the calculated correlation coefficients (and can be checked numerically, see the simulation sketch below).
*Don't be frustrated, however, if you cannot recognize the different correlations in the signal waveforms.
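As a plausibility check (a small simulation sketch under the stated assumptions; it is not part of the original solution), the correlation coefficients can also be estimated empirically from sample functions:
<pre>
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
u = rng.standard_normal(N)      # zero mean, unit variance
v = rng.standard_normal(N)      # statistically independent of u

x1 = 1.0 * u + 0.0 * v          # A_1 = 1,   B_1 = 0
x2 = 0.0 * u + 1.0 * v          # A_2 = 0,   B_2 = 1
x3 = 0.8 * u + 0.6 * v          # A_3 = 0.8, B_3 = 0.6

R = np.corrcoef([x1, x2, x3])   # empirical correlation matrix
print(np.round(R, 2))           # approx. rho_12 = 0, rho_13 = 0.8, rho_23 = 0.6
</pre>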
{{ML-Fuß}}