Theory of Stochastic Signals/Statistical Dependence and Independence

General Definition of Statistical Dependence


So far, we have not paid much attention to  statistical dependence  between events, even though we have already made use of it, for example in the case of two disjoint sets:   if an element belongs to  $A$, it certainly cannot also be contained in the disjoint set  $B$.

Such a  deterministic dependence  between two sets or two events is the strongest conceivable form of dependence.  Statistical dependence is less pronounced. Let us start with its complement, statistical independence:

$\text{Definition:}$  Two events  $A$  and  $B$  are called  statistically independent  if the probability of the intersection  $A ∩ B$  is equal to the product of the individual probabilities:

$${\rm Pr}(A \cap B) = {\rm Pr}(A)\cdot {\rm Pr}(B).$$


  • In some applications, statistical independence is obvious, for example in the "coin toss" experiment:  the probability of "heads" or "tails" is independent of whether  heads  or  tails  occurred in the previous toss.
  • Likewise, the individual outcomes of the random experiment "throwing a roulette ball" are always statistically independent of each other under fair conditions, even if individual players with betting systems do not want to admit this.
  • In other applications, however, the question of whether two events are statistically independent is difficult or impossible to answer intuitively.  Here one can arrive at the correct answer only by checking the formal independence criterion given above, as the following example will show;  the sketch below first illustrates the criterion on the simpler coin experiment.
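
The formal criterion can be checked mechanically on any finite sample space with equally likely outcomes. The following minimal Python sketch (not part of the original text; all names are chosen freely for illustration) verifies it for two fair coin tosses:

    from fractions import Fraction
    from itertools import product

    # Sample space: two fair coin tosses, all four outcomes equally likely.
    outcomes = list(product("HT", repeat=2))

    def pr(event):
        # Exact probability of an event given as a predicate on an outcome.
        return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

    heads_first  = lambda o: o[0] == "H"
    heads_second = lambda o: o[1] == "H"

    # Independence criterion: Pr(A ∩ B) == Pr(A) * Pr(B)
    joint = pr(lambda o: heads_first(o) and heads_second(o))
    print(joint == pr(heads_first) * pr(heads_second))   # True -> independent

The same enumeration pattern is used below to check the dice events of Example 1.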


Examples for statistically independent events

$\text{Example 1:}$  We consider the experiment "throwing two dice", where the two dice can be distinguished by their colors red  $(R)$  and blue  $(B)$.

The graphic illustrates this experiment:  the sum  $S = R + B$  is entered at each position  $(R, B)$  of the two-dimensional grid.

For the description below, we define the following events:

  • $A_1$:  The number shown on the red die is  $R < 4$  (red background)   ⇒   ${\rm Pr}(A_1) = 1/2$,
  • $A_2$:  The number shown on the blue die is  $B > 4$  (blue font)   ⇒   ${\rm Pr}(A_2) = 1/3$,
  • $A_3$:  The sum of the two dice is  $S = 7$  (green outline)   ⇒   ${\rm Pr}(A_3) = 1/6$,
  • $A_4$:  The sum of the two dice is  $S = 8$    ⇒   ${\rm Pr}(A_4) = 5/36$,
  • $A_5$:  The sum of the two dice is  $S = 10$    ⇒   ${\rm Pr}(A_5) = 3/36$.


The graph can be interpreted as follows:

  • The two events  $A_1$  and  $A_2$  are statistically independent because the probability  ${\rm Pr}(A_1 ∩ A_2) = 1/6$  of the intersection is equal to the product of the two individual probabilities  ${\rm Pr}(A_1) = 1/2$  and  ${\rm Pr}(A_2) = 1/3$.  Given the problem definition, any other result would have been very surprising.
  • The events  $A_1$  and  $A_3$  are also statistically independent, since  ${\rm Pr}(A_1) = 1/2$,  ${\rm Pr}(A_3) = 1/6$  and  ${\rm Pr}(A_1 ∩ A_3) = 1/12$.  The intersection probability  $(1/12)$  arises because three of the  $36$  squares are both highlighted in red and outlined in green.
  • In contrast, there are statistical ties between the events  $A_1$  and  $A_4$,  because the intersection probability  ${\rm Pr}(A_1 ∩ A_4) = 1/18 = 4/72$  is not equal to the product  ${\rm Pr}(A_1) \cdot {\rm Pr}(A_4)= 1/2 \cdot 5/36 = 5/72$.
  • The two events  $A_1$  and  $A_5$  are even disjoint   ⇒   ${\rm Pr}(A_1 ∩ A_5) = 0$:   none of the squares with a red background is labeled  $S=10$.  This example shows that disjointness is a particularly pronounced form of statistical dependence;  all of these statements are verified by enumeration in the sketch below.
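
All probabilities and independence statements of this example can be reproduced by exhaustive enumeration of the  $36$  equally likely outcomes. A minimal Python sketch (illustrative only):

    from fractions import Fraction
    from itertools import product

    outcomes = list(product(range(1, 7), repeat=2))      # all 36 pairs (R, B)

    def pr(event):
        # Exact probability: favorable outcomes divided by 36.
        return Fraction(sum(1 for r, b in outcomes if event(r, b)), 36)

    A1 = lambda r, b: r < 4         # Pr = 1/2
    A2 = lambda r, b: b > 4         # Pr = 1/3
    A3 = lambda r, b: r + b == 7    # Pr = 1/6
    A4 = lambda r, b: r + b == 8    # Pr = 5/36
    A5 = lambda r, b: r + b == 10   # Pr = 3/36

    for name, ev in [("A2", A2), ("A3", A3), ("A4", A4), ("A5", A5)]:
        joint = pr(lambda r, b: A1(r, b) and ev(r, b))
        print(name, joint == pr(A1) * pr(ev))
    # Output: A2 True, A3 True, A4 False, A5 False -- exactly as argued above.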

Conditional Probability


If there are statistical ties between the two events  $A$  and  $B$ , the (unconditional) probabilities  ${\rm Pr}(A)$  and  ${\rm Pr}(B)$  do not describe the situation unambiguously in the statistical sense.  So-called conditional probabilities are then required.

$\text{Definitions:}$  The  conditional probability of  $A$  under condition  $B$  can be calculated as follows:

$${\rm Pr}(A\hspace{0.05cm} \vert \hspace{0.05cm} B) = \frac{ {\rm Pr}(A \cap B)}{ {\rm Pr}(B)}.$$

Similarly, the conditional probability of  $B$  under condition  $A$  is:

$${\rm Pr}(B\hspace{0.05cm} \vert \hspace{0.05cm}A) = \frac{ {\rm Pr}(A \cap B)}{ {\rm Pr}(A)}.$$

Combining these two equations, we get  Bayes' Theorem:

$${\rm Pr}(B \hspace{0.05cm} \vert \hspace{0.05cm} A) = \frac{ {\rm Pr}(A\hspace{0.05cm} \vert \hspace{0.05cm} B)\cdot {\rm Pr}(B)}{ {\rm Pr}(A)}.$$


Below are some properties of conditional probabilities:

  • A conditional probability always lies between  $0$  and  $1$,  including these two limits:   $0 \le {\rm Pr}(A \hspace{0.05cm} | \hspace{0.05cm} B) \le 1$.
  • If the condition  $B$  is kept fixed, all calculation rules given in the chapter  Set Theory Basics  for the unconditional probabilities  ${\rm Pr}(A)$  and  ${\rm Pr}(B)$  still apply.
  • If the events  $A$  and  $B$  are disjoint, then  ${\rm Pr}(A\hspace{0.05cm} | \hspace{0.05cm} B) = {\rm Pr}(B\hspace{0.05cm} | \hspace{0.05cm}A)= 0$.
  • If  $B$  is a proper or improper subset of  $A$,  then  ${\rm Pr}(A \hspace{0.05cm} | \hspace{0.05cm} B) =1$.
  • If two events  $A$  and  $B$  are statistically independent, their conditional probabilities are equal to the unconditional ones, as the following calculation shows:
$${\rm Pr}(A \hspace{0.05cm} | \hspace{0.05cm} B) = \frac{{\rm Pr}(A \cap B)}{{\rm Pr}(B)} = \frac{{\rm Pr} ( A) \cdot {\rm Pr} ( B)} { {\rm Pr}(B)} = {\rm Pr} ( A).$$
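
On a finite sample space, the definition translates into a one-line helper. The following Python sketch (the events  $A$  and  $B$  here are hypothetical, chosen only to demonstrate two of the properties listed above):

    from fractions import Fraction
    from itertools import product

    outcomes = list(product(range(1, 7), repeat=2))   # two-dice sample space (R, B)

    def pr(event):
        return Fraction(sum(1 for r, b in outcomes if event(r, b)), 36)

    def pr_cond(a, b):
        # Pr(A | B) = Pr(A ∩ B) / Pr(B); only defined for Pr(B) > 0.
        return pr(lambda r, y: a(r, y) and b(r, y)) / pr(b)

    A = lambda r, b: r + b >= 4     # hypothetical event A
    B = lambda r, b: r + b == 12    # B = {(6,6)} is a subset of A

    print(pr_cond(A, B))                                   # 1, since B ⊆ A
    # Bayes' theorem: Pr(B|A) = Pr(A|B) * Pr(B) / Pr(A)
    print(pr_cond(B, A) == pr_cond(A, B) * pr(B) / pr(A))  # True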
Example of statistically dependent events

$\text{Example 2:}$  We again consider the experiment "throwing two dice", where, as in  $\text{Example 1}$,  $S = R + B$  denotes the sum of the red and the blue die.

Here we consider ties between the two events

  • $A_1$:  The number shown on the red die is  $R < 4$  (red background)   ⇒   ${\rm Pr}(A_1) = 1/2$,
  • $A_4$:  The sum of the two dice is  $S = 8$  (green outline)   ⇒   ${\rm Pr}(A_4) = 5/36$,


and refer again to the event

  • $A_3$:  The sum of the two dice is  $S = 7$   ⇒   ${\rm Pr}(A_3) = 1/6$.


Regarding this graph, note:

  • There are statistical ties between the events  $A_1$  and  $A_4$,  since the intersection probability  ${\rm Pr}(A_1 ∩ A_4) = 2/36 = 4/72$  is not equal to the product  ${\rm Pr}(A_1) \cdot {\rm Pr}(A_4)= 1/2 \cdot 5/36 = 5/72$.
  • The conditional probability  ${\rm Pr}(A_1 \hspace{0.05cm} \vert \hspace{0.05cm} A_4) = 2/5$  can be calculated as the quotient of the joint probability  ${\rm Pr}(A_1 ∩ A_4) = 2/36$  and the probability  ${\rm Pr}(A_4) = 5/36$.
  • Since  $A_1$  and  $A_4$  are statistically dependent, the conditional probability  ${\rm Pr}(A_1 \hspace{0.05cm}\vert \hspace{0.05cm} A_4) = 2/5$  (two of the five squares outlined in green are highlighted in red)  is not equal to the absolute probability  ${\rm Pr}(A_1) = 1/2$  (half of all squares are highlighted in red).
  • Similarly, the conditional probability  ${\rm Pr}(A_4 \hspace{0.05cm} \vert \hspace{0.05cm} A_1) = 2/18 = 4/36$  (two of the  $18$  squares with a red background are outlined in green)  is unequal to the absolute probability  ${\rm Pr}(A_4) = 5/36$  (a total of five of the  $36$  squares are outlined in green).
  • This last result can also be derived using  Bayes' theorem,  for example:
$${\rm Pr}(A_4 \hspace{0.05cm} \vert\hspace{0.05cm} A_1) = \frac{ {\rm Pr}(A_1 \hspace{0.05cm} \vert\hspace{0.05cm} A_4)\cdot {\rm Pr} ( A_4)} { {\rm Pr}(A_1)} = \frac{2/5 \cdot 5/36}{1/2} = 1/9.$$
  • In contrast, the following conditional probabilities hold for  $A_1$  and the event  $A_3$,  which is statistically independent of it, see  Example 1:
$${\rm Pr}(A_{\rm 1} \hspace{0.05cm}\vert \hspace{0.05cm} A_{\rm 3}) = {\rm Pr}(A_{\rm 1}) = \rm 1/2\hspace{0.5cm}{\rm and}\hspace{0.5cm}{\rm Pr}(A_{\rm 3} \hspace{0.05cm} \vert \hspace{0.05cm} A_{\rm 1}) = {\rm Pr}(A_{\rm 3}) = 1/6.$$
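
The numbers of this example can again be confirmed by enumeration, reusing the same pattern as above (a standalone, illustrative sketch):

    from fractions import Fraction
    from itertools import product

    outcomes = list(product(range(1, 7), repeat=2))

    def pr(event):
        return Fraction(sum(1 for r, b in outcomes if event(r, b)), 36)

    def pr_cond(a, b):
        # Pr(A | B) = Pr(A ∩ B) / Pr(B)
        return pr(lambda r, y: a(r, y) and b(r, y)) / pr(b)

    A1 = lambda r, b: r < 4
    A3 = lambda r, b: r + b == 7
    A4 = lambda r, b: r + b == 8

    print(pr_cond(A1, A4), pr(A1))   # 2/5  vs. 1/2   -> dependent
    print(pr_cond(A4, A1), pr(A4))   # 1/9  vs. 5/36  -> dependent
    print(pr_cond(A1, A3), pr(A1))   # 1/2  vs. 1/2   -> independent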


General Multiplication Theorem


We now consider several events, denoted by  $A_i$  with  $1 ≤ i ≤ I$.  However, these events  $A_i$  no longer form a  complete system,  which means that

  • they are not pairwise disjoint, and
  • there may also be statistical ties between the individual events.


$\text{Definition:}$  In this case, the so-called  joint probability,  i.e. the probability of the intersection of all  $I$  events  $A_i$,  is given by:

$${\rm Pr}(A_{\rm 1} \cap \hspace{0.02cm}\text{ ...}\hspace{0.1cm} \cap A_{I}) = {\rm Pr}(A_{I})\hspace{0.05cm}\cdot\hspace{0.05cm}{\rm Pr}(A_{I \rm -1} \hspace{0.05cm}\vert \hspace{0.05cm} A_I) \hspace{0.05cm}\cdot \hspace{0.05cm}{\rm Pr}(A_{I \rm -2} \hspace{0.05cm}\vert\hspace{0.05cm} A_{I - \rm 1}\cap A_I)\hspace{0.05cm} \cdot \hspace{0.02cm}\text{ ...} \hspace{0.1cm} \cdot\hspace{0.05cm} {\rm Pr}(A_{\rm 1} \hspace{0.05cm}\vert \hspace{0.05cm}A_{\rm 2} \cap \hspace{0.02cm}\text{ ...} \hspace{0.1cm}\cap A_{ I}).$$

In the same way, of course, the following also holds:

$${\rm Pr}(A_{\rm 1} \cap \hspace{0.02cm}\text{ ...}\hspace{0.1cm} \cap A_{I}) = {\rm Pr}(A_1)\hspace{0.05cm}\cdot\hspace{0.05cm}{\rm Pr}(A_2 \hspace{0.05cm}\vert \hspace{0.05cm} A_1) \hspace{0.05cm}\cdot \hspace{0.05cm}{\rm Pr}(A_3 \hspace{0.05cm}\vert \hspace{0.05cm} A_1\cap A_2)\hspace{0.05cm} \cdot \hspace{0.02cm}\text{ ...}\hspace{0.1cm} \cdot\hspace{0.05cm} {\rm Pr}(A_I \hspace{0.05cm}\vert \hspace{0.05cm}A_1 \cap \hspace{0.02cm} \text{ ...} \hspace{0.1cm}\cap A_{ I-1}).$$
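
Both decompositions are identities and can be verified numerically on a finite sample space. A Python sketch for  $I = 3$,  with three freely chosen dice events, using the second form of the theorem:

    from fractions import Fraction
    from itertools import product

    outcomes = list(product(range(1, 7), repeat=2))   # two-dice sample space

    def pr(event):
        return Fraction(sum(1 for o in outcomes if event(o)), 36)

    A = lambda o: o[0] < 4             # R < 4
    B = lambda o: o[1] > 4             # B > 4
    C = lambda o: o[0] + o[1] == 8     # S = 8

    # Direct enumeration of Pr(A ∩ B ∩ C) ...
    joint = pr(lambda o: A(o) and B(o) and C(o))
    # ... versus the chain Pr(A) * Pr(B|A) * Pr(C|A ∩ B):
    pr_ab   = pr(lambda o: A(o) and B(o))
    chained = pr(A) * (pr_ab / pr(A)) * (joint / pr_ab)
    print(joint, joint == chained)     # 1/18 True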


$\text{Example 3:}$  A lottery drum contains ten tickets, three of which are winners  $($event $T_1)$.  The probability of drawing two winners with two tickets is then:

$${\rm Pr}(T_1 \cap T_2) = {\rm Pr}(T_1) \cdot {\rm Pr}(T_2 \hspace{0.05cm }\vert \hspace{0.05cm} T_1) = 3/10 \cdot 2/9 = 1/15 \approx 6.7 \%.$$
  • This takes into account that, at the second draw  $($event $T_2)$,  only nine tickets, two of them winners, would remain in the drum if a winner had been drawn in the first round   ⇒   ${\rm Pr}(T_2 \hspace{0.05cm} \vert\hspace{0.05cm} T_1) = 2/9$.
  • If, however, the tickets were returned to the drum after each draw, the events  $T_1$  and  $T_2$  would be statistically independent, and the following would hold:
$$ {\rm Pr}(T_1 ∩ T_2) = (3/10)^2 = 9\%.$$
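
In Python, the two cases differ only in the second factor (plain arithmetic with exact fractions):

    from fractions import Fraction

    pr_t1      = Fraction(3, 10)    # 3 winners among 10 tickets
    pr_t2_cond = Fraction(2, 9)     # without replacement: 2 winners among 9 left

    print(pr_t1 * pr_t2_cond)       # 1/15, i.e. about 6.7 %
    print(pr_t1 * pr_t1)            # with replacement: 9/100 = 9 %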

Inference Probability


Let the events  $A_i$  with  $1 ≤ i ≤ I$  again form a complete system. That is:

  • All events are pairwise disjoint  $(A_i ∩ A_j = ϕ$  for all  $i ≠ j)$.
  • Their union yields the universal set:
$$\rm \bigcup_{\it i=1}^{\it I}\it A_i = \it G.$$

In addition, we consider the event  $B$,  all of whose conditional probabilities  ${\rm Pr}(B \hspace{0.05cm} | \hspace{0.05cm} A_i)$  with indices  $1 ≤ i ≤ I$  are known.

$\text{Theorem of Total Probability:}$  Under the conditions stated above, the (unconditional) probability of the event  $B$  is:

$${\rm Pr}(B) = \sum_{i={\rm1} }^{I}{\rm Pr}(B \cap A_i) = \sum_{i={\rm1} }^{I}{\rm Pr}(B \hspace{0.05cm} \vert\hspace{0.05cm} A_i)\cdot{\rm Pr}(A_i).$$


$\text{Definition:}$  From this equation, Bayes' theorem yields the  inference probability:

$${\rm Pr}(A_i \hspace{0.05cm} \vert \hspace{0.05cm} B) = \frac{ {\rm Pr}( B \mid A_i)\cdot {\rm Pr}(A_i )}{ {\rm Pr}(B)} = \frac{ {\rm Pr}(B \hspace{0.05cm} \vert \hspace{0.05cm} A_i)\cdot {\rm Pr}(A_i )}{\sum_{k={\rm1} }^{I}{\rm Pr}(B \hspace{0.05cm} \vert \hspace{0.05cm} A_k)\cdot{\rm Pr}(A_k) }.$$
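
Both formulas translate directly into two small helper functions. A Python sketch (the function and argument names are chosen freely):

    from fractions import Fraction

    def total_probability(pr_b_given, pr_a):
        # Pr(B) = sum over i of Pr(B|A_i) * Pr(A_i), for a complete system A_1..A_I.
        return sum(pb * pa for pb, pa in zip(pr_b_given, pr_a))

    def inference(i, pr_b_given, pr_a):
        # Inference probability Pr(A_i | B) via Bayes' theorem.
        return pr_b_given[i] * pr_a[i] / total_probability(pr_b_given, pr_a)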


$\text{Example 4:}$  Munich student dormitories house students

  • of the Ludwig-Maximilians-Universität (LMU)  $($event  $L$   ⇒   ${\rm Pr}(L) = 70\%)$  and
  • of the Technische Universität München (TUM)  $($event  $T$   ⇒   ${\rm Pr}(T) = 30\%)$.


It is further known that  $60\%$  of all students at the LMU are female, but only  $10\%$  at the TUM.

  • The proportion of female students in the dormitory  $($event $W)$  can then be determined using the theorem of total probability:
$${\rm Pr}(W) = {\rm Pr}(W \hspace{0.05cm} \vert \hspace{0.05cm} L)\hspace{0.01cm}\cdot\hspace{0.01cm}{\rm Pr}(L) \hspace{0.05cm}+\hspace{0.05cm} {\rm Pr}(W \hspace{0.05cm} \vert \hspace{0.05cm} T)\hspace{0.01cm}\cdot\hspace{0.01cm}{\rm Pr}(T) = \rm 0.6\hspace{0.01cm}\cdot\hspace{0.01cm}0.7\hspace{0.05cm}+\hspace{0.05cm}0.1\hspace{0.01cm}\cdot \hspace{0.01cm}0.3 = 45 \%.$$
  • If one meets a female student, one can use the inference probability
$${\rm Pr}(L \hspace{-0.05cm}\mid \hspace{-0.05cm}W) = \frac{ {\rm Pr}(W \hspace{-0.05cm}\mid \hspace{-0.05cm}L)\cdot {\rm Pr}(L) }{ {\rm Pr}(W \hspace{-0.05cm}\mid \hspace{-0.05cm}L) \cdot {\rm Pr}(L) +{\rm Pr}(W \hspace{-0.05cm}\mid \hspace{-0.05cm}T) \cdot {\rm Pr}(T)}=\rm \frac{0.6\cdot 0.7}{0.6\cdot 0.7 + 0.1\cdot 0.3}=\frac{14}{15}\approx 93.3 \%$$
to predict that she studies at the LMU.  A thoroughly realistic result  (at least in the past).
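
The same numbers recomputed in Python with exact fractions (standalone, for illustration):

    from fractions import Fraction

    pr_L, pr_T     = Fraction(7, 10), Fraction(3, 10)   # dormitory shares
    pr_W_L, pr_W_T = Fraction(6, 10), Fraction(1, 10)   # female shares at LMU / TUM

    pr_W = pr_W_L * pr_L + pr_W_T * pr_T                # total probability
    print(pr_W)                                         # 9/20 = 45 %

    print(pr_W_L * pr_L / pr_W)                         # 14/15, about 93.3 %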


The statements of this section are summarized in the (German-language) learning video  Statistische Abhängigkeit und Unabhängigkeit  ("Statistical dependence and independence").

Exercises for the Chapter


Exercise 1.4: 2S/3E Channel Model

Exercise 1.4Z: Sum of Ternary Quantities

Exercise 1.5: Drawing Cards

Exercise 1.5Z: Failure Probabilities