Information Theory addresses some fundamental questions about systems that store or communicate data.

• Most electronic communication channels are noisy. Telephone and radio channels are more obviously so, but even coaxial cables and fiber-optic links will occasionally "flip a 0 to a 1," especially if we try to push data through them at ever-increasing rates. How can one send data in an essentially error-free manner over such noisy communication channels? What is the ultimate transmission rate that a channel can support?
• Many electronic storage systems employ some form of data compression. Data compression is also used for transmission in order to reduce the amount of data that needs to be transmitted. What is the ultimate compression factor that can be achieved while guaranteeing perfect recovery of the compressed data? What more can be achieved if one is willing to tolerate some prespecified small distortion of the recovered data (as in images or video)?

The answers to the first set of questions lead to the concept of Channel Capacity; those to the second lead to Entropy. Information Theory is therefore an essential weapon in a communication engineer's arsenal.
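Both quantities are easy to compute for simple models. As a sketch (not part of the course materials), the snippet below evaluates the entropy of a binary source, which bounds its compressibility in bits per symbol, and the capacity of a binary symmetric channel with crossover probability p, which is 1 minus the entropy of the noise:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - entropy([p, 1.0 - p])

print(entropy([0.5, 0.5]))   # 1.0 bit: a fair coin is incompressible
print(entropy([0.9, 0.1]))   # ~0.469 bits: a biased source compresses well
print(bsc_capacity(0.1))     # ~0.531 bits per channel use
```

A channel that flips 10% of its bits still supports reliable communication at any rate below about 0.531 bits per use; above that, errors are unavoidable.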

Information Theory, due to the nature of its subject matter, also makes fundamental contributions to statistical physics (thermodynamics), computer science (string complexity), economics (optimal portfolios), probability theory (large deviations) and statistics (Fisher information, hypothesis testing). This makes Information Theory a useful tool for students of other disciplines.

Course Description (Syllabus)

Learn about Claude E. Shannon's description of information (shannon.pdf) and understand how it limits reliable communication over noisy channels. Find out about the early days of error control coding (1950-1970) and acquire a fundamental understanding of algebraic coding theory. Identify the different constraints of error control coding in applications such as deep space communications (e.g. Voyager, Giotto), storage (e.g. CD-ROM, hard disk drives), and optical and wireless communications (e.g. IS-95, GSM, CDMA 2000, UMTS) (1970-today). Discover the fascinating concept of concatenated coding and iterative (turbo) decoding, an information processing technique that finds its analogy in turbo engines (1993-today). Study how iterative processing lets us approach Shannon's limits.
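To give a flavor of the error control coding studied in the course, here is a minimal illustrative sketch (not course code) of the classic Hamming(7,4) code: 4 data bits are protected by 3 parity bits, and any single bit flip introduced by the channel can be located and corrected from the syndrome:

```python
def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit codeword (positions 1..7, parity at 1, 2, 4)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct any single-bit error and return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s4   # syndrome = 1-based error position, 0 if none
    if pos:
        c[pos - 1] ^= 1          # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]

msg = [1, 0, 1, 1]
cw = hamming74_encode(msg)
cw[4] ^= 1                            # channel flips one bit
assert hamming74_decode(cw) == msg    # the single error is corrected
```

Algebraic codes of this kind, and their successors, are what make the deep space, storage, and wireless applications above workable despite noise.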