===Brief summary===

{{BlueBox|TEXT=From the earliest beginnings of message transmission as an engineering discipline, it has been the endeavour of many engineers and mathematicians to find a quantitative measure for the
*contained $\rm information$ $($quite generally: "the knowledge about something"$)$
*in a $\rm message$ $($here we mean "a collection of symbols and/or states"$)$.


The $($abstract$)$ information is communicated by the $($concrete$)$ message and can be conceived as the interpretation of a message.

[https://de.wikipedia.org/wiki/Claude_Shannon '''Claude Elwood Shannon'''] succeeded in 1948 in establishing a consistent theory of the information content of messages, which was revolutionary in its time and created a new, still highly topical field of science: »'''Shannon's information theory'''«, named after him.

This is what the fourth book in the $\rm LNTwww$ series deals with, in particular:
# Entropy of discrete-value sources with and without memory, as well as natural message sources: Definition, meaning and computational possibilities $($see the formula sketch after the content overview$)$.
# Source coding and data compression, especially the "Lempel–Ziv–Welch method" and "Huffman's entropy encoding" $($a short code sketch follows after the content overview$)$.
# Various entropies of two-dimensional discrete-value random quantities. Mutual information and channel capacity. Application to digital signal transmission.
# Continuous-value information theory. Differential entropy. AWGN channel capacity with continuous-valued as well as discrete-valued input.


⇒ First a »'''content overview'''« on the basis of the »'''four main chapters'''« with a total of »'''13 individual chapters'''« and »'''106 sections'''«:}}


===Content===
{{Collapsible-Kopf}}
{{Collapse1| header=Entropy of Discrete Sources
}}
{{Collapsible-Fuß}}
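
Before the exercise and link collections, the central quantities of the four topics above can be previewed with a few standard formulas. This is only a compact sketch in generic textbook notation: the symbols $M$, $p_\mu$, $X$, $Y$, $P$ and $\sigma^2$ are assumptions for this preview, not necessarily the book's own notation.

$$H = -\sum_{\mu=1}^{M} p_\mu \cdot \log_2 p_\mu \hspace{0.8cm} \text{(entropy of a discrete memoryless source, in bit/symbol)},$$
$$I(X;Y) = H(X) - H(X|Y), \hspace{0.8cm} C = \max_{P_X} I(X;Y) \hspace{0.8cm} \text{(mutual information, channel capacity)},$$
$$C_{\rm AWGN} = \frac{1}{2} \cdot \log_2 \left( 1 + \frac{P}{\sigma^2} \right) \hspace{0.8cm} \text{(AWGN channel capacity per channel use)}.$$

For example, a binary source with equally probable symbols attains the maximum entropy $H = 1$ bit/symbol; any unequal distribution yields $H < 1$ bit/symbol.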
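
The Huffman procedure mentioned in point 2 can likewise be sketched in a few lines of Python. This is a minimal, self-contained illustration using only the standard library; the function name "huffman_code" and the example probabilities are chosen for this sketch and do not stem from the book.

<pre>
import heapq

def huffman_code(probs):
    """Binary Huffman code for a dict {symbol: probability} -> {symbol: codeword}."""
    # Heap entries: (probability, tie-break counter, {symbol: partial codeword});
    # the counter keeps tuple comparison away from the non-comparable dicts.
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # merge the two least probable nodes,
        p2, _, c2 = heapq.heappop(heap)   # prefixing '0' resp. '1' to their codewords
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        count += 1
        heapq.heappush(heap, (p1 + p2, count, merged))
    return heap[0][2]

probs = {"A": 0.50, "B": 0.25, "C": 0.125, "D": 0.125}
code = huffman_code(probs)                       # e.g. A: 0, B: 10, C: 110, D: 111
mean_len = sum(probs[s] * len(code[s]) for s in probs)
print(code, mean_len)                            # 1.75 bit/symbol = source entropy here
</pre>

For these probabilities the average codeword length equals the source entropy of $1.75$ bit/symbol, so the code leaves no redundancy; in general, Huffman coding of a memoryless source guarantees an average length below $H + 1$ bit/symbol.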

{{BlaueBox|TEXT=
In addition to these theory pages, we also offer exercises and multimedia modules on this topic, which could help to clarify the teaching material:

$(1)$ [https://en.lntwww.de/Category:Information_Theory:_Exercises $\text{Exercises}$]

$(2)$ [[LNTwww:Learning_videos_to_"Information_Theory"|$\text{Learning videos}$]]

$(3)$ [[Applets_to_"Information_Theory"|$\text{Applets}$]] }}


===Further links===

{{BlaueBox|TEXT=
$(4)$ [[LNTwww:Bibliography_to_"Theory_of_Stochastic_Signals"|$\text{Bibliography}$]]

$(5)$ [[LNTwww:Imprint_for_the_book_"Stochastic_Signal_Theory"|$\text{Imprint}$]]}}
<br><br>