===Brief summary===
{{BlaueBox|TEXT=From the earliest beginnings of message transmission as an engineering discipline, it has been the endeavour of many engineers and mathematicians to find a quantitative measure for the
*contained information (quite generally: »the knowledge about something«)
*in a message (here we mean »a collection of symbols and/or states«).

The (abstract) information is communicated by the (concrete) message and can be conceived as the interpretation of a message.

[https://en.wikipedia.org/wiki/Claude_Shannon '''Claude Elwood Shannon'''] succeeded in 1948 in establishing a consistent theory of the information content of messages, which was revolutionary in its time and created a new, still highly topical field of science: »'''Shannon's information theory'''«, named after him.

This is what the fourth book in the LNTwww series deals with, in particular:
# Entropy of discrete-value sources with and without memory, as well as natural message sources: definition, meaning and computational possibilities (see the example after this list).
# Source coding and data compression, especially the »Lempel–Ziv–Welch method« and »Huffman's entropy encoding«.
# Various entropies of two-dimensional discrete-value random quantities. Mutual information and channel capacity. Application to digital signal transmission.
# Continuous-value information theory. Differential entropy. AWGN channel capacity with continuous-valued as well as discrete-valued input.
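
To illustrate these topics, a small worked example (notation and numbers chosen here purely for illustration, not taken from the chapters themselves): a discrete memoryless source with $M$ symbols and symbol probabilities $p_\mu$ has the entropy
:$$H = -\sum_{\mu=1}^{M} p_\mu \cdot \log_2 (p_\mu) \hspace{0.3cm} \text{(unit: bit/symbol)}.$$
A binary source with $p_1 = p_2 = 0.5$ thus attains the maximum value $H = 1$ bit/symbol, whereas $p_1 = 0.9$, $p_2 = 0.1$ gives only $H \approx 0.47$ bit/symbol; by the source coding theorem, such a source can be compressed to about $H$ bit/symbol on average, but not below, which is the goal of the procedures in the second chapter. The third chapter centres on the mutual information $I(X;\,Y)$ between two discrete random variables, and the fourth on Shannon's AWGN channel capacity $C = 1/2 \cdot \log_2 (1 + P/N)$ in bit per channel use, with signal power $P$ and noise power $N$.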

⇒ First a »'''content overview'''« on the basis of the »'''four main chapters'''« with a total of »'''13 individual chapters'''« and »'''106 sections'''«:}}


===Content===

{{Collapsible-Kopf}}
{{Collapse1| header=Entropy of Discrete Sources
| submenu=
*[[/Discrete Memoryless Sources/]]
*[[/Discrete Sources with Memory/]]
*[[/Natural Discrete Sources/]]
}}
{{Collapse2 | header=Source Coding – Data Compression
|submenu=
*[[/General Description/]]
*[[/Compression According to Lempel, Ziv and Welch/]]
*[[/Entropy Coding According to Huffman/]]
*[[/Further Source Coding Methods/]]
}}
{{Collapse3 | header=Mutual Information Between Two Discrete Random Variables
|submenu=
*[[/Some Preliminary Remarks on Two-Dimensional Random Variables/]]
*[[/Different Entropy Measures of Two-Dimensional Random Variables/]]
*[[/Application to Digital Signal Transmission/]]
}}
{{Collapse4 | header=Information Theory for Continuous Random Variables
|submenu=
*[[/Differential Entropy/]]
*[[/AWGN Channel Capacity for Continuous-Valued Input/]]
*[[/AWGN Channel Capacity for Discrete-Valued Input/]]
}}
{{Collapsible-Fuß}}

===Exercises and multimedia===

{{BlaueBox|TEXT=
In addition to these theory pages, we also offer exercises and multimedia modules on this topic, which can help to clarify the teaching material:

(1) [https://en.lntwww.de/Category:Information_Theory:_Exercises Exercises]

(2) [[LNTwww:Learning_videos_to_"Information_Theory"|Learning videos]]

(3) [[LNTwww:Applets_to_"Information_Theory"|Applets]] }}


===Further links===

{{BlaueBox|TEXT=
(4) [[LNTwww:Bibliography_to_"Information_Theory"|Bibliography]]

(5) [[LNTwww:Imprint_for_the_book_"Information_Theory"|Imprint]]}}
<br><br>

__NOTOC__
__NOEDITSECTION__