Shannon's entropy - Introduction to information theory

Lecturer: Jaime Sevilla

Date: 10/10/2017

Time: 17:00

Place: Room 104

Abstract:

Entropy is a versatile tool that aids the study and analysis of diverse areas.

In this talk we will axiomatically derive the definition of entropy, and examine some of its most common interpretations and properties.
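As a preview of the derivation, the quantity that these axioms single out (up to the choice of logarithm base) is Shannon's entropy of a discrete random variable $X$ with distribution $p$:

$$H(X) = -\sum_{x} p(x) \log_2 p(x)$$

With base-2 logarithms it is measured in bits; a fair coin flip, for instance, has $H = 1$ bit.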

We will then use our newfound knowledge to study the relation between thermodynamic entropy and Shannon’s entropy, and its applications to communication, data compression and investment.
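To give a flavor of the investment connection: in Kelly gambling on a horse race with win probabilities $p_i$ and odds of $o_i$-for-1, betting one's capital in proportion to the probabilities maximizes the long-run doubling rate, which works out to

$$W^{*} = \sum_i p_i \log_2 o_i - H(p)$$

so the entropy of the race appears directly as the price of uncertainty: the less predictable the outcome, the slower the optimal growth of capital.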

Lastly, we will dip our toes into the advanced topic of algorithmic complexity, which further generalizes the concept of Shannon’s entropy to allow a scheme of universal reasoning.
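For orientation, the central object of that topic is Kolmogorov complexity: relative to a fixed universal machine $U$, the complexity of a string $x$ is the length of the shortest program that prints it,

$$K_U(x) = \min \{\, |q| : U(q) = x \,\}$$

which can be read as an entropy for individual objects rather than for probability distributions.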

Keywords: Markov process interpretation of thermodynamic entropy, Kelly gambling, the noisy channel theorem, mutual information, relative entropy, Kolmogorov complexity.

Slides

Slides about information theory in Prezi

Bibliography

Promotional poster

Poster