# OVERVIEW OF THE FIRST MAIN CHAPTER #
This first chapter provides a brief summary of »probability calculation«, which many of you surely already know from your school days and which is an important prerequisite for understanding the chapters that follow.
This chapter includes:
- some »definitions« such as »random experiment«, »outcome«, »event«, and »probability«,
- the »set-theoretical basics« relevant for probability theory,
- the clarification of »statistical dependence« and »statistical independence«,
- the mathematical treatment of statistical dependence by »Markov chains«.
## Experiment and outcome ##
The starting point of any statistical investigation is a »random experiment«. By this, one understands
- an experiment that can be repeated as often as desired under the same conditions, with an uncertain »outcome« $E$ $($German: "Ergebnis" ⇒ letter: "E"$)$,
- in which, however, the set $ \{E_μ \}$ of possible outcomes can be specified.
$\text{Definition:}$ The number of possible outcomes is called the »outcome set size« $M$. The following then holds:
- $$E_\mu \in G = \{E_\mu\}= \{E_1, \hspace{0.1cm}\text{...} \hspace{0.1cm}, E_M \} .$$
- The variable $μ$ can take all integer values between $1$ and $M$.
- $G = \{E_\mu\}$ is called the event space or the »universal set« $($German: "Grundmenge" ⇒ letter: "G"$)$ with $M$ possible outcomes.
$\text{Example 1:}$
- In the experiment »coin toss« there are only two possible outcomes, namely »heads« and »tails« ⇒ $M = 2$.
- In contrast, in the random experiment »throwing a roulette ball« a total of $M = 37$ different outcomes are possible, and it holds for the universal set in this case:
- $$G = \{E_\mu\} = \{0, 1, 2, \text{...} \hspace{0.1cm} , 36\}.$$
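The universal set and its size $M$ can be written down directly in code. The following minimal Python sketch is purely illustrative (the variable names are chosen here and do not appear in the text); it reproduces the two universal sets of Example 1 and checks $M$:

```python
# Purely illustrative: the universal set G of a random experiment,
# represented as a Python set, and its size M = |G|.

G_coin = {"heads", "tails"}        # coin toss:  M = 2 possible outcomes
G_roulette = set(range(37))        # roulette:   outcomes 0, 1, ..., 36

print(len(G_coin))                 # 2
print(len(G_roulette))             # 37
```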
## Classical definition of probability ##
We assume that each trial results in exactly one outcome from $G$ and that each of these $M$ outcomes is equally possible $($none is preferred or disadvantaged$)$.
$\text{Definition:}$ With this assumption, the »probability« of each outcome $E_μ$ is equally:
- $$\Pr (E_\mu) = 1/{M}.$$
This is the »classical definition of probability«. Here, ${\rm Pr}(\text{...} )$ stands for »probability« and is to be understood as a mathematical function.
$\text{Example 2:}$ In the random experiment »coin toss«, the probabilities of the two possible outcomes are
- $$\rm Pr(heads)=Pr(tails)=1/2.$$
- This assumes that each trial ends either with »heads« or with »tails« and that the coin cannot come to rest on its edge during a trial.
- In the experiment »throwing a roulette ball« the probabilities ${\rm Pr}( E_μ) = 1/37$ are equal for all numbers from $0$ to $36$, but only if the roulette table has not been manipulated.
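As a complement to Example 2, the following Python sketch (an illustration only, assuming an unbiased coin; nothing in it is taken from the text) repeats the experiment »coin toss« many times and shows that the relative frequency of »heads« approaches the classical probability $1/2$:

```python
import random

random.seed(0)        # fixed seed so repeated runs give the same result
N = 100_000           # number of independent coin tosses

# Count how often "heads" occurs and form the relative frequency.
heads = sum(random.choice(("heads", "tails")) == "heads" for _ in range(N))
print(heads / N)      # close to 0.5 = Pr(heads) for large N
```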
Note: Probability theory – and the statistics based on it – can only provide well-founded statements if all implicitly agreed conditions are actually fulfilled.
- Checking these conditions is not the task of statistics, but of those who apply it.
- Since this basic rule is often violated, statistics has a much worse reputation in society than it actually deserves.
## Event and event probability ##
$\text{Definitions:}$
(1) By an »event« we mean a set or summary of outcomes. We refer to the set of all events as the »event set« $\{A_i \}$.
- Since the number $I$ of possible events $\{A_i \}$ is generally not the same as the number $M$ of possible outcomes ⇒ the elements of $G = \{ E_μ \}$, different indices are chosen here.
(2) If an event $A_i$ is composed of $K$ $($elementary$)$ outcomes, the »event probability« is defined as follows:
- $${\rm Pr} (A_i) = \frac{K}{M} = \frac{\rm Number\hspace{0.1cm}of\hspace{0.1cm}favorable\hspace{0.1cm}outcomes}{\rm Number\hspace{0.1cm}of\hspace{0.1cm}possible\hspace{0.1cm}outcomes}.$$
This equation is called the »Laplace probability definition«.
- Here, »favorable outcomes« are those outcomes that belong to the composite event $A_i$.
- From this definition it is already clear that a probability must always lie between $0$ and $1$ $($including these two limits$)$.
$\text{Example 3:}$ We now consider the experiment »throwing a die«. The possible outcomes $($number of points$)$ are thus $E_μ ∈ G = \{1, 2, 3, 4, 5, 6\}$.
Let us now define two events $(I = 2)$, viz.
- $A_1 = \big[$the outcome is even$\big] = \{2, 4, 6\}$, and
- $A_2 = \big[$the outcome is odd$\big] = \{1, 3, 5\}$,
then the event set $\{A_1, A_2\}$ is equal to the universal set $G$. For this example, the events $A_1$ and $A_2$ represent a so-called »complete system«.
On the other hand, the further event set $\{A_3, A_4\}$ is not equal to the universal set $G$, if we define the single events as follows:
- $A_3 = \big[$the outcome is smaller than 3$\big] = \{1, 2\}$,
- $A_4 =\big[$the outcome is bigger than 3$\big] = \{4, 5, 6\}$.
Here, the event set $\{A_3, A_4\}$ does not include the element $3$. The probabilities of the events defined here are
- $${\rm Pr}( A_3) = 1/3,\ {\rm Pr}( A_1) ={\rm Pr}(A_2) = {\rm Pr}(A_4) = 1/2.$$
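The probabilities of Example 3 can also be obtained mechanically from the Laplace definition ${\rm Pr}(A_i) = K/M$ by representing each event as a set of outcomes. The following Python sketch is again only an illustration; the names are chosen here and do not appear in the text:

```python
from fractions import Fraction

G  = {1, 2, 3, 4, 5, 6}       # universal set of the experiment "throwing a die"
A1 = {2, 4, 6}                # the outcome is even
A2 = {1, 3, 5}                # the outcome is odd
A3 = {1, 2}                   # the outcome is smaller than 3
A4 = {4, 5, 6}                # the outcome is bigger than 3

def pr(event, universe=G):
    """Laplace probability: number of favorable / number of possible outcomes."""
    return Fraction(len(event & universe), len(universe))

print(pr(A1), pr(A2), pr(A3), pr(A4))   # 1/2 1/2 1/3 1/2
```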
⇒ The topic of this chapter is illustrated with examples in the $($German language$)$ learning video
»Klassische Definition der Wahrscheinlichkeit« ⇒ »Classical definition of probability«.
## Exercises for the chapter ##
Exercise 1.1: A Special Dice Game
Exercise 1.1Z: Sum of Two Ternary Signals