Information Theory

Course Description

This course covers the fundamental limits of data compression and the rate at which data can be reliably communicated over a noisy channel. Topics include measures of information, entropy, mutual information, Markov chains, the source coding theorem, data compression, the noisy channel coding theorem, error-correcting codes, and bounds on the performance of communication systems. Part of the course is also devoted to the use of information measures in statistical learning, with applications in areas such as machine learning, image processing, and finance.
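
As a minimal illustration of these limits, the sketch below (Python with NumPy; an illustrative choice on our part, not course material) computes the binary entropy function and the capacity C = 1 - H2(p) of a binary symmetric channel, the simplest instance of the channel limits studied in the course.

    import numpy as np

    def binary_entropy(p):
        """H2(p) = -p*log2(p) - (1-p)*log2(1-p), with H2(0) = H2(1) = 0."""
        if p == 0.0 or p == 1.0:
            return 0.0
        return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

    def bsc_capacity(p):
        """Capacity in bits per channel use of a binary symmetric
        channel with crossover probability p: C = 1 - H2(p)."""
        return 1.0 - binary_entropy(p)

    print(binary_entropy(0.5))   # 1.0 bit: a fair coin is maximally uncertain
    print(bsc_capacity(0.11))    # about 0.5 bits per channel use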

Prerequisites

Knowledge of probability and stochastic processes is required.

Course Goal

To familiarize students with entropy, mutual information, and divergence, their basic properties, and how they relate to information transmission. Beyond the classical topics of information transmission, we expect students to apply information-theoretic concepts to problems in learning and signal processing, especially source separation, where information theory plays a key role.
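
The following sketch (Python with NumPy, assumed available; illustrative only, not a reference implementation) computes the three quantities named above for discrete distributions, with mutual information obtained as the divergence between a joint distribution and the product of its marginals.

    import numpy as np

    def entropy(p):
        """Shannon entropy H(p) in bits of a discrete distribution p."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                        # 0 log 0 = 0 by convention
        return -np.sum(p * np.log2(p))

    def kl_divergence(p, q):
        """Relative entropy D(p || q) in bits; assumes q > 0 wherever p > 0."""
        p, q = np.asarray(p, float), np.asarray(q, float)
        mask = p > 0
        return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

    def mutual_information(pxy):
        """I(X;Y) = D(p(x,y) || p(x)p(y)) for a joint distribution matrix."""
        pxy = np.asarray(pxy, float)
        px = pxy.sum(axis=1, keepdims=True)   # marginal of X
        py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
        return kl_divergence(pxy.ravel(), (px @ py).ravel())

    # Joint distribution of a binary symmetric channel with crossover
    # probability 0.1 and a uniform input.
    pxy = np.array([[0.45, 0.05],
                    [0.05, 0.45]])
    print(entropy([0.5, 0.5]))          # 1.0 bit
    print(mutual_information(pxy))      # about 0.531 bits, i.e. 1 - H2(0.1)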

Program / Syllabus

  • Introduction
  • Review of probability theory
  • Information and Entropy
  • Source coding
  • Channel coding and capacity
  • Information-theoretic learning
  • Independent component analysis
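
For the last topic, here is a minimal source-separation sketch using FastICA from scikit-learn (an assumption on our part; the course does not prescribe this library). It mixes two known signals and recovers them, up to scale and permutation, by maximizing non-Gaussianity.

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 8, 2000)
    s1 = np.sin(2 * t)                       # sinusoidal source
    s2 = np.sign(np.sin(3 * t))              # square-wave source
    S = np.c_[s1, s2]                        # true sources, shape (2000, 2)

    A = np.array([[1.0, 0.5],
                  [0.5, 1.0]])               # mixing matrix
    X = S @ A.T                              # observed mixtures
    X += 0.02 * rng.standard_normal(X.shape) # small measurement noise

    ica = FastICA(n_components=2, random_state=0)
    S_est = ica.fit_transform(X)             # recovered sources
    # S_est matches S up to ordering and scaling, the usual ICA ambiguities.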

Textbooks

  1. Thomas M. Cover and Joy A. Thomas, Elements of Information Theory. John Wiley & Sons, Second edition, 2006.
  2. David J. C. MacKay, Information Theory, Inference and Learning Algorithms. Cambridge University Press, 2003.
  3. Simon Haykin, Neural Networks and Learning Machines. Prentice Hall, Third edition, 2008.

Material (in Portuguese)

Video: Information and Entropy
