Information Theory
Revision as of 15:55, 25 November 2020
Since the early days of communications as an engineering discipline, many engineers and mathematicians have sought a quantitative measure of the $\rm information$ (in general: "the knowledge of something") contained in a $\rm message$ (here understood as "a collection of symbols and/or states").
The (abstract) information is communicated by the (concrete) message and can be seen as an interpretation of that message.
In 1948, Claude Elwood Shannon succeeded in establishing a consistent theory of the information content of messages, which was revolutionary in its time and created a new, still highly relevant field of science: $\text{Shannon's Information Theory}$, named after him.
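As an illustration of such a quantitative measure (this example is not part of the original text): Shannon's entropy assigns to a message the average information content in bits per symbol. A minimal sketch in Python, assuming symbol probabilities are estimated from the relative frequencies in a sample message:

```python
from collections import Counter
from math import log2

def entropy(message: str) -> float:
    """Shannon entropy H = -sum(p * log2(p)) in bits per symbol,
    with p estimated from the relative symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A message over four equally likely symbols carries 2 bits per symbol:
print(entropy("abcd"))  # → 2.0
```

A single repeated symbol yields an entropy of 0 bits: such a message carries no information, which matches the intuition behind Shannon's measure.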
The course material corresponds to a $\text{lecture with two semester hours per week (SWS) and one SWS exercise}$.
Here is a table of contents based on the $\text{four main chapters}$ with a total of $\text{13 individual chapters}$.
Contents
In addition to these theory pages, we also offer exercises and multimedia modules that can help clarify the teaching material:
- $\text{Exercises}$
- $\text{Learning videos}$
- $\text{Redesigned applets}$, based on HTML5, also executable on smartphones
- $\text{Former applets}$, based on SWF, executable only under Windows with Adobe Flash Player
$\text{More links:}$
$(1)$ $\text{Recommended literature for the book}$
$(2)$ $\text{General notes about the book}$ (Authors, other participants, materials as a starting point for the book, list of sources)