EE2340/5847 Course Information

Welcome to EE2340 (Information Sciences) and EE5847 (Information Theory), 2022 edition. This course will cover the basics of information theory. You are also encouraged to take the follow-up courses EE5390 (Source Coding) and EE6317 (Channel Coding), as well as the related courses EE5342 (Detection Theory) and EE5357 (Estimation Theory).

Generally speaking, information theory has its roots in Claude Shannon's 1948 paper "A mathematical theory of communication" (see the references below), which dealt with two fundamental problems: how compactly data can be represented, and how reliably it can be communicated across a noisy channel.

Since its inception, information theory has proved to be a valuable tool in several fields including cryptography, physics, machine learning and computational biology.

In this course, you will be introduced to basic information quantities and their mathematical properties. Along the way, you will see probabilistic models of the basic data compression and communication problems, how to measure the performance of algorithms for these two tasks, and the fundamental limits associated with them.
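
As a small taste of the kind of quantity we will study (an illustrative sketch, not part of the graded material), the following Python snippet computes the entropy of a Bernoulli(p) source, which the source coding theorem identifies as the best achievable compression rate in bits per symbol:

  import math

  def binary_entropy(p: float) -> float:
      """Entropy H(p) of a Bernoulli(p) source, in bits per symbol."""
      if p in (0.0, 1.0):
          return 0.0  # a deterministic source carries no information
      return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

  # A Bernoulli(0.1) source can, in principle, be compressed to about
  # 0.469 bits per symbol, well below the 1 bit per symbol of the raw data.
  print(binary_entropy(0.1))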

Classes will tentatively be held online. To facilitate discussions, we will use Google Classroom. An invite will be sent to all registered students in the first class.

1 Assessment (tentative):

Each student will be expected to

  • attend classes regularly, and participate in class
  • solve homework assignments, roughly 2-3 in total (collaboration is fine as long as it is explicitly declared, but you must write up the solutions in your own words)
  • solve in-class quizzes
  • take a final exam

The following assessment scheme is tentative:

Component                            Weight
Homeworks                            40%
Quizzes                              20%
Attendance and class participation   10%
Final exam                           30%

2 Instructor:

Name:   Dr. Shashank Vatedka
Office: 214/D, Block-C
Email:  shashankvatedka@ee.iith.ac.in

3 Class timings:

  • Slot E: Tuesdays 10-11, Thursdays 12-1 and Fridays 9-10
  • Venue: Google meet
  • Quiz schedule: TBD

4 Tentative syllabus:

  • What is information theory?
  • Probabilistic modeling of sources and channels
  • The source compression problem
  • Entropy and the source coding theorem
  • Proof of achievability for binary memoryless sources
  • Communication across a noisy channel
  • Mutual information and the channel coding theorem
  • Proof of achievability for the binary symmetric channel
  • Convexity and Jensen's inequality
  • Entropy, joint entropy and conditional entropy, and their properties
  • Kullback-Leibler divergence and its properties
  • Mutual information and its properties
  • Chain rules
  • Log-sum inequality
  • Data processing inequality
  • Sufficient statistics
  • Fano's inequality
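
For orientation, the three central quantities in this list have the following standard definitions (for discrete random variables X and Y, with logarithms to base 2 so that everything is in bits):

  H(X) = -\sum_{x} p(x) \log_2 p(x)
  D(p \| q) = \sum_{x} p(x) \log_2 \frac{p(x)}{q(x)}
  I(X; Y) = \sum_{x, y} p(x, y) \log_2 \frac{p(x, y)}{p(x)\, p(y)} = H(X) - H(X \mid Y)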

5 Textbook and references:

Primarily, the material will be drawn from

  • Lecture notes, which will be shared.
  • Elements of Information Theory, Thomas M. Cover and Joy A. Thomas, second edition, Wiley. A limited number of copies is available in the university library. While the first edition has all the material we need for the course, the second edition has a much-expanded problem set.
  • Information Theory, Inference, and Learning Algorithms, David MacKay (soft copy available for free on the author's website). A very nice book, but with a very different flavour from what will be covered in the course. Suggested for additional reading.

Other references:

  • "A mathematical theory of communication", Claude Shannon, Bell systems Tech Journal, 1948. The classic paper that started this field.
  • A Student's Guide to Coding and Information Theory, Stefan Moser.
  • Information Theory: Coding Theorems for Discrete Memoryless Systems, Imre Csiszár and János Körner. An advanced textbook; recommended reading after completing this course as well as EE5390 and EE6317.
  • Similar courses offered at IISc, Stanford, and MIT. Also one on Coursera.

6 Tentative schedule:

Lecture recordings can be viewed via this YouTube playlist.

  • Introduction (Week 1). Material: Notes, MIT probability lectures. Homework 1.
    - Logistics
    - Motivation for studying information theory
    - A refresher in probability
    - Probabilistic modeling of sources and channels: memoryless systems
  • Data compression for discrete sources (Week 2). Material: handwritten class notes, Notes.
    - Fixed-length compression: rate, probability of error
    - Entropy and the source coding theorem
    - Proof of achievability for the Bernoulli source
  • Discrete memoryless channels (Week 2). Material: handwritten class notes, Notes.
    - Model, and the modular structure of a digital communication system
    - Channel coding, rate, capacity, probability of error
    - Mutual information and the channel coding theorem
    - Coding as a packing problem, and capacity of the binary symmetric channel
    - Information measures for continuous alphabets
  • Properties of information measures (Week 3). Material: Notes, handwritten notes.
    - Convexity and Jensen's inequality
    - Joint and conditional entropies
    - Positivity
    - Chain rule of entropy and mutual information
    - Conditioning reduces entropy
  • More properties (Weeks 4-5)
    - Log-sum inequality
    - Positivity, convexity and concavity
    - Chain rule for mutual information
    - Data processing inequality
    - Fano's inequality
  • Applications (Week 5)
    - A proof of the converse of the source coding theorem
    - A proof of the converse of the channel coding theorem
    - Entropy-maximizing distributions
  • Final exam (date TBD)

Author: Shashank Vatedka

Created: 2022-02-08 Tue 11:20