Information Theory
Brief summary
From the earliest beginnings of message transmission as an engineering discipline, it has been the endeavour of many engineers and mathematicians to find a quantitative measure for the
- information contained (quite generally: »the knowledge about something«)
- in a message (here meaning »a collection of symbols and/or states«).
The (abstract) information is conveyed by the (concrete) message and can be regarded as the interpretation of that message.
Claude Elwood Shannon succeeded in 1948 in establishing a consistent theory of the information content of messages, which was revolutionary in its time and created a new, still highly topical field of science, named after him: »Shannon's information theory«.
This is what the fourth book in the LNTwww series deals with, in particular:
- Entropy of discrete-value sources with and without memory, as well as natural message sources: Definition, meaning and computational possibilities.
- Source coding and data compression, especially the »Lempel–Ziv–Welch method« and »Huffman's entropy encoding«.
- Various entropies of two-dimensional discrete-value random quantities. Mutual information and channel capacity. Application to digital signal transmission.
- Continuous-value information theory. Differential entropy. AWGN channel capacity with continuous-valued as well as discrete-valued input (the two central formulas are sketched after this list).
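To make items 1 and 4 of this list concrete, here is a minimal Python sketch of the two central formulas: Shannon's entropy $H(X) = -\sum_i p_i \cdot \log_2 (p_i)$ of a discrete memoryless source, and the AWGN channel capacity $C = 1/2 \cdot \log_2(1 + \rm SNR)$ for a continuous-valued Gaussian input. The probability values and the SNR below are illustrative assumptions, not examples taken from the book.

```python
import math

def entropy(probs):
    """Entropy H(X) = -sum(p_i * log2(p_i)) of a discrete memoryless
    source, in bit/symbol; symbols with p_i = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def awgn_capacity(snr):
    """AWGN channel capacity C = 1/2 * log2(1 + SNR) in bit per
    channel use, valid for a continuous-valued Gaussian input."""
    return 0.5 * math.log2(1 + snr)

# Illustrative values (assumptions, not from the book):
print(entropy([0.5, 0.5]))                 # 1.0  bit/symbol (binary, equiprobable)
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bit/symbol (quaternary source)
print(awgn_capacity(15))                   # 2.0  bit/use at a linear SNR of 15
```

For sources with memory (item 1), the per-symbol probabilities would have to be replaced by conditional or block probabilities; the channel capacity of item 3 is defined as the maximum of the mutual information over all input distributions.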
⇒ First, a »content overview« based on the »four main chapters« with a total of »13 individual chapters« and »106 sections«:
Content
Exercises and multimedia
In addition to these theory pages, we also offer exercises and multimedia modules on this topic that can help to clarify the teaching material:
(1) Exercises
(2) Learning videos
(3) Applets
Further links