Information Theory


Since the early days of communications as an engineering discipline, many engineers and mathematicians have sought a quantitative measure of

*the $\rm Information$  (in general: "the knowledge of something")  contained in a  $\rm message$  (here understood as "a collection of symbols and/or states").


The (abstract) information is conveyed by the (concrete) message and can be regarded as one interpretation of that message.

In 1948, Claude Elwood Shannon succeeded in establishing a consistent theory of the information content of messages, which was revolutionary for its time and opened up a new field of science that is still highly topical today: the theory named after him, $\text{Shannon's Information Theory}$.
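For a first impression of such a quantitative measure, consider the entropy of a discrete memoryless source, which is treated in detail in the first main chapter: for $M$ possible symbols occurring with probabilities $p_1, \dots, p_M$, the entropy is

$$H = -\sum_{\mu=1}^{M} p_\mu \cdot \log_2 p_\mu \hspace{0.5cm} \text{(in bit/symbol)}.$$

For example, a binary source with equally probable symbols $(p_1 = p_2 = 0.5)$ attains the maximum value $H = 1$ bit/symbol.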

The course material corresponds to a  $\text{lecture with two semester hours per week (SWS) and one SWS of exercises}$.

Here is a table of contents based on the  $\text{four main chapters}$  with a total of  $\text{13 individual chapters}$.


===Contents===

{{Collapsible-Kopf}}
{{Collapse1| header=Entropy of Discrete Sources
| submenu=
*[[/Discrete Memoryless Sources/]]
*[[/Discrete Sources with Memory/]]
*[[/Natural Discrete Sources/]]
}}
{{Collapse2 | header=Source Coding - Data Compression
|submenu=
*[[/General Description/]]
*[[/Compression According to Lempel, Ziv and Welch/]]
*[[/Entropy Coding According to Huffman/]]
*[[/Further Source Coding Methods/]]
}}
{{Collapse3 | header=Mutual Information Between Two Discrete Random Variables
|submenu=
*[[/Some Preliminary Remarks on Two-Dimensional Random Variables/]]
*[[/Different Entropy Measures of Two-Dimensional Random Variables/]]
*[[/Application to Digital Signal Transmission/]]
}}
{{Collapse4 | header=Information Theory for Continuous Random Variables
|submenu=
*[[/Differential Entropy/]]
*[[/AWGN Channel Capacity for Continuous Input/]]
*[[/AWGN Channel Capacity for Discrete Input/]]
}}
{{Collapsible-Fuß}}

In addition to these theory pages, we also offer Exercises and multimedia modules that can help to clarify the teaching material.



$\text{More links:}$

$(1)$    $\text{Recommended literature for the book}$

$(2)$    $\text{General notes about the book}$   (Authors,  other participants,  materials as a starting point for the book,  list of sources)