Information Theory addresses two fundamental questions about systems that store or communicate data: how fast can data be transmitted reliably over a noisy channel, and how compactly can data be represented?

The answer to the first question leads to the concept of Channel Capacity, and the answer to the second to Entropy. Information Theory is therefore an essential weapon in a communication engineer's arsenal.
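
As a purely illustrative sketch (not part of the course text), both quantities can be computed for the simplest setting: the entropy of a binary source and the capacity of a binary symmetric channel that flips each bit with probability eps. Plain Python, no external libraries assumed.

```python
import math

def entropy(p):
    """Entropy H(X), in bits, of a binary source with P(X=1) = p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(eps):
    """Capacity C = 1 - H(eps), in bits per channel use, of a binary
    symmetric channel that flips each transmitted bit with probability eps."""
    return 1.0 - entropy(eps)

# A fair coin carries one full bit per symbol; a channel that flips
# 10% of its bits can still carry roughly 0.53 bits per use.
print(entropy(0.5))       # 1.0
print(bsc_capacity(0.1))  # ~0.531
```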

Information Theory, due to the nature of its subject matter, also makes fundamental contributions to statistical physics (thermodynamics), computer science (string complexity), economics (optimal portfolios), probability theory (large deviations) and statistics (Fisher information, hypothesis testing). This makes Information Theory a useful tool for students of other disciplines.


Course Description (Syllabus)

Learn about Claude E. Shannon's description of information (shannon.pdf) and see how it characterizes the limits of reliable communication over noisy channels. Find out about the early days of error control coding (1950-1970) and acquire a fundamental understanding of algebraic coding theory. Identify the different constraints that error control coding faces in applications such as deep space communications (e.g. Voyager, Giotto), storage (e.g. CD-ROM, hard disk drives), and optical and wireless communications (e.g. IS-95, GSM, CDMA 2000, UMTS) (1970-today). Discover the fascinating concept of concatenated coding and iterative (turbo) decoding, an information processing technique that finds its analogy in turbo engines (1993-today). Study how iterative processing lets us approach Shannon's limits.
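
To make the algebraic coding theory part of the syllabus concrete, here is a minimal illustrative sketch (not course material) of the Hamming(7,4) code, one of the earliest error control codes from the 1950-1970 era: it maps 4 data bits to a 7-bit codeword and corrects any single flipped bit. The matrices below follow one standard textbook convention; variants differ in column ordering.

```python
import numpy as np

# Systematic generator matrix G = [I | P] and parity-check matrix
# H = [P^T | I] of a (7,4) Hamming code (one common convention).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(data4):
    """Map 4 data bits to a 7-bit codeword (all arithmetic mod 2)."""
    return (np.array(data4) @ G) % 2

def decode(word7):
    """Correct up to one flipped bit, then return the 4 data bits."""
    word7 = np.array(word7).copy()
    syndrome = (H @ word7) % 2
    if syndrome.any():
        # A single-bit error at position i produces column i of H
        # as the syndrome, which locates the bit to flip back.
        for i in range(7):
            if np.array_equal(H[:, i], syndrome):
                word7[i] ^= 1
                break
    return word7[:4]

codeword = encode([1, 0, 1, 1])
received = codeword.copy()
received[2] ^= 1  # one bit flipped by channel noise
assert list(decode(received)) == [1, 0, 1, 1]
```

Turbo codes, treated later in the course, go far beyond such single-error-correcting codes by iteratively exchanging soft information between two simple constituent decoders.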