Information Theory
Since the early beginnings of communications as an engineering discipline, many engineers and mathematicians have sought a quantitative measure of
*the $\rm Information$ (in general: "the knowledge of something") contained in a $\rm message$ (understood here as "a collection of symbols and/or states").

The (abstract) information is communicated by the (concrete) message and can be seen as an interpretation of the message.
[https://de.wikipedia.org/wiki/Claude_Shannon Claude Elwood Shannon] succeeded in 1948 in establishing a consistent theory of the information content of messages, which was revolutionary at the time and opened up a new, still highly topical field of science: the theory named after him, $\text{Shannon's Information Theory}$.
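As a first, informal preview of such a quantitative measure (the notation here is only exemplary; the precise definitions follow in the first main chapter): for a discrete memoryless source that emits $M$ possible symbols with probabilities $p_\mu$, Shannon's $\text{entropy}$ gives the average information content per symbol,
:$$H = -\sum_{\mu = 1}^{M} p_\mu \cdot {\rm log}_2 \ p_\mu \hspace{0.5cm} \text{(in bit/symbol)}.$$
For example, a binary source with equally probable symbols $(p_1 = p_2 = 0.5)$ has the entropy $H = 1\ \rm bit/symbol$.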
The course material corresponds to a $\text{lecture with two semester hours per week (SWS) and one SWS exercise}$.

Here is a table of contents based on the $\text{four main chapters}$ with a total of $\text{13 individual chapters}$.
===Contents===
{{Collapsible-Kopf}}
{{Collapse1| header=Entropy of Discrete Sources
| submenu=
*[[/Discrete Memoryless Sources/]]
*[[/Discrete Sources with Memory/]]
*[[/Natural Discrete Sources/]]
}}
{{Collapse2 | header=Source Coding - Data Compression
|submenu=
*[[/General Description/]]
*[[/Compression According to Lempel, Ziv and Welch/]]
*[[/Entropy Coding According to Huffman/]]
*[[/Further Source Coding Methods/]]
}}
{{Collapse3 | header=Mutual Information Between Two Discrete Random Variables
|submenu=
*[[/Some Preliminary Remarks on Two-Dimensional Random Variables/]]
*[[/Different Entropy Measures of Two-Dimensional Random Variables/]]
*[[/Application to Digital Signal Transmission/]]
}}
{{Collapse4 | header=Information Theory for Continuous Random Variables
|submenu=
*[[/Differential Entropy/]]
*[[/AWGN Channel Capacity for Continuous Input/]]
*[[/AWGN Channel Capacity for Discrete Input/]]
}}
{{Collapsible-Fuß}}
In addition to these theory pages, we also offer exercises and multimedia modules that can help to clarify the teaching material:
*[https://en.lntwww.de/Kategorie:Aufgaben_zu_Informationstheorie $\text{Exercises}$]
*[[LNTwww:Lernvideos_zu_Informationstheorie|$\text{Learning videos}$]]
*[[LNTwww:HTML5-Applets_zu_Informationstheorie|$\text{Redesigned applets}$]], based on HTML5, also executable on smartphones
*[[LNTwww:SWF-Applets_zu_Informationstheorie|$\text{Former applets}$]], based on SWF, executable only under Windows with ''Adobe Flash Player''.
<br><br>
$\text{More links:}$
<br><br>
$(1)$ [[LNTwww:Literaturempfehlung_zu_Informationstheorie|$\text{Recommended literature for the book}$]]

$(2)$ [[LNTwww:Weitere_Hinweise_zum_Buch_Informationstheorie|$\text{General notes about the book}$]] (Authors, other participants, materials as a starting point for the book, list of sources)
<br><br>
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
__NOTOC__
__NOEDITSECTION__