This course on information theory is concerned with the fundamental limits of data compression and the rate at which data can be reliably communicated over a noisy channel. Topics include measures of information, entropy, mutual information, Markov chains, the source coding theorem, data compression, the noisy channel coding theorem, error-correcting codes, and bounds on the performance of communication systems. Part of the course is also devoted to the application of information measures to problems of statistical learning, with applications in areas such as machine learning, image processing, and finance.
To provide students with a working knowledge of entropy, mutual information, and divergence, their basic properties, and how they relate to information transmission. Beyond the classical topics of information transmission, we expect students to apply information-theoretic concepts to problems in learning and signal processing, especially source separation, where information theory plays a key role.