
Information Theory

From LNTwww
{{Collapse4 | header=Information Theory for Continuous Random Variables
|submenu=
*[[/Differential Entropy/]]
*[[/AWGN Channel Capacity with Continuous Value Input/]]
*[[/AWGN Channel Capacity with Discrete Value Input/]]
}}

{{Collapsible-Fuß}}

Revision as of 16:29, 28 March 2021

Since the early beginnings of communications as an engineering discipline, many engineers and mathematicians have sought a quantitative measure of

  • the information  (in general: "the knowledge of something")  contained in a  message  (here understood as "a collection of symbols and/or states").


The (abstract) information is communicated by the (concrete) message and can be seen as an interpretation of the message.

In 1948,  Claude Elwood Shannon  succeeded in establishing a consistent theory of the information content of messages. This theory was revolutionary in its time and created a new, still highly topical field of science:  the theory named after him,  Shannon's information theory.
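The quantitative measure at the heart of Shannon's theory is the entropy: the average information content of a message, in bits per symbol. As a minimal sketch (the function name and the frequency-based probability estimate are illustrative, not taken from the course text):

```python
from collections import Counter
from math import log2

def entropy(message: str) -> float:
    """Estimate the average information content (Shannon entropy) of a
    message in bits per symbol, using relative symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    # A symbol occurring with probability p contributes p * log2(1/p) bits.
    return sum((c / n) * log2(n / c) for c in counts.values())

print(entropy("aaaa"))  # a single repeated symbol carries 0.0 bits/symbol
print(entropy("abcd"))  # four equally likely symbols: 2.0 bits/symbol
```

A message whose outcome is certain thus carries no information, while a message drawn from four equally likely symbols needs two bits per symbol to describe.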

The course material corresponds to a lecture of two semester hours per week (SWS) plus one SWS of exercises.

Here is a table of contents based on the  four main chapters  with a total of  13 individual chapters.


Contents

In addition to these theory pages, we also offer exercises and multimedia modules that can help to clarify the teaching material:



More links:

(1)    Recommended literature for the book

(2)    General notes about the book   (Authors,  other participants,  materials as a starting point for the book,  list of sources)