Information Theory
Since the early beginnings of communications as an engineering discipline, many engineers and mathematicians have sought a quantitative measure for the $\rm information$ $($in general: "the knowledge of something"$)$ contained in a $\rm message$ $($understood here as "a collection of symbols and/or states"$)$.

The (abstract) information is communicated by the (concrete) message and can be seen as an interpretation of the message.
In 1948, Claude Elwood Shannon succeeded in establishing a consistent theory of the information content of messages. It was revolutionary in its time and created a new field of science that is still highly topical today: the theory named after him, $\text{Shannon's Information Theory}$.
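As a brief preview $($the exact definitions follow in the chapters listed below$)$: Shannon's central measure, the $\text{entropy}$, quantifies the average information content of a discrete source that emits $M$ different symbols with probabilities $p_1, \hspace{0.05cm}\text{...}\hspace{0.05cm} , p_M$:

$$H = -\sum_{\mu=1}^{M} p_\mu \cdot \log_2 \hspace{0.05cm} p_\mu \hspace{0.5cm} \text{(unit: "bit/symbol")}.$$

For example, a binary source with two equally probable symbols $(p_1 = p_2 = 0.5)$ has the entropy $H = 1$ bit/symbol; any imbalance between the two probabilities lowers $H$.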
Here, first, is a »content overview« based on the »four main chapters« with a total of »13 individual chapters«:
Contents
In addition to these theory pages, we also offer exercises and multimedia modules that can help to clarify the teaching material:
$\text{Other links:}$
$(1)$ $\text{Bibliography for the book}$
$(2)$ $\text{General notes about the book}$ (authors, other participants, materials as a starting point for the book, list of sources)