Information Theory
{{Collapse4 | header=Information Theory for Continuous Random Variables
|submenu=
*[[/Differential Entropy/]]
*[[/AWGN Channel Capacity with Continuous Value Input/]]
*[[/AWGN Channel Capacity with Discrete Value Input/]]
}}
{{Collapsible-Fuß}}
Revision as of 16:29, 28 March 2021
Since the early beginnings of communications as an engineering discipline, many engineers and mathematicians have sought a quantitative measure of the information (in general: "the knowledge of something") contained in a message (here understood as "a collection of symbols and/or states").
The (abstract) information is conveyed by the (concrete) message and can be seen as an interpretation of the message.
In 1948, Claude Elwood Shannon succeeded in establishing a consistent theory of the information content of messages, which was revolutionary in its time and created a new, still highly topical field of science: the theory named after him, Shannon's Information Theory.
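At the heart of Shannon's theory is the entropy of a discrete source, which quantifies the average information content in bits per symbol. As a minimal illustrative sketch (the function name `entropy` is our own choice, not from the course material):

```python
import math

def entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol.

    Terms with p = 0 contribute nothing, by the convention 0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair binary source carries exactly 1 bit per symbol:
print(entropy([0.5, 0.5]))        # 1.0
# A source with 4 equally likely symbols carries 2 bits per symbol:
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

A biased source (e.g. probabilities 0.9 and 0.1) yields an entropy below 1 bit, which is exactly what makes data compression possible.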
The course material corresponds to a lecture of two semester hours per week (SWS) plus one SWS of exercises.
Here is a table of contents based on the four main chapters, with a total of 13 individual chapters.
Contents
In addition to these theory pages, we also offer exercises and multimedia modules that can help clarify the teaching material:
- Exercises
- Learning videos
- Redesigned applets, based on HTML5, also executable on smartphones
- Former applets, based on SWF, executable only under Windows with the Adobe Flash Player
Further links:
(1) Recommended literature for the book
(2) General notes about the book (authors, other participants, materials used as a starting point for the book, list of sources)