EE5357 Course Information

Welcome to EE5357 (Estimation Theory). This is a follow-up course to EE2340 (Information Sciences) and EE5847 (Information Theory). While they are not strict prerequisites, it would be helpful if you have already taken EE5390 (Source Coding) or EE5342 (Detection Theory).

Most of our interaction will take place on Piazza and Google Classroom. Please send me an email if you have not received an invite.

Classes will be online. We will mainly have live lectures, but these might be supplemented with a few recorded videos.

1 Assessment (tentative):

Each student will be expected to

  • attend online lectures and watch recorded videos
  • solve short in-class quizzes
  • participate in class and on Piazza
  • solve homework assignments (roughly two); collaboration is fine as long as it is explicitly declared, but you must write up the solutions in your own words
  • present a paper/do a project
Homeworks                            50%
Attendance and class participation   10%
Quizzes                              10%
Final project/paper presentation     30%

2 Instructor:

Name: Dr. Shashank Vatedka
Email: shashankvatedka@ee.iith.ac.in

3 Class timings:

  • Slot D: Mondays 12:00-13:00, Tuesdays 09:00-10:00 and Fridays 11:00-12:00
  • Class venue: Online

4 Textbook and references:

Primarily, the material will be drawn from

  • Statistical Inference, second edition (CB), George Casella and Roger L. Berger
  • Lecture notes and slides, uploaded on Piazza.

Other references:

  • An Introduction to Signal Detection and Estimation, H. V. Poor
  • Theory of Point Estimation, E. L. Lehmann and George Casella
  • Graphical Models, Exponential Families and Variational Inference, Martin J. Wainwright and Michael I. Jordan
  • Elements of Information Theory, second edition, Thomas M. Cover and Joy A. Thomas, Wiley. A limited number of copies is available in the university library. The first edition has all the material we need for the course, but the second edition has a much expanded problem set.
  • Information Theory, Inference, and Learning Algorithms, David MacKay (soft copy available for free on the author's website). A very nice book, but with a very different flavour from what will be covered in the course. Suggested for additional reading.
  • A Student's Guide to Coding and Information Theory, Stefan Moser
  • Information Theory: Coding Theorems for Discrete Memoryless Systems, Imre Csiszar and Janos Korner. An advanced textbook; recommended reading after completing this course as well as EE6317.

5 Tentative syllabus:

  • Introduction to estimation and regression
    - estimation and regression problems
    - sufficient statistics
  • Point estimation
    - MSE estimation
    - Unbiased estimators, best unbiased estimators
    - Minimum variance unbiased estimators and the CR bound
    - Rao-Blackwell theorem
    - Bayes risk
  • Maximum likelihood estimation
    - EM algorithm
  • MMSE estimation
    - Linear MMSE estimation
    - Wiener and Kalman filters (if time permits)
    - James-Stein estimator
  • Other topics (time permitting)
    - Interval estimation
    - Regression

6 Topics for final paper/project presentation

You may work in groups of at most two. Specific topics will be updated here. Good sources for the latest work on estimation and regression are the IEEE Transactions on Information Theory, the Journal of Machine Learning Research, and conferences such as the annual Conference on Learning Theory (COLT), Neural Information Processing Systems (NeurIPS), the International Conference on Machine Learning (ICML), the IEEE International Symposium on Information Theory (ISIT), and the Information Theory Workshop (ITW). Several good preprints are made available on the arXiv preprint server (cs.IT and cs.ML).

You may, with approval, pick any topic in estimation theory from a good textbook and give a presentation on it, for example topics from this book, this book, or these lecture notes.

  • There is a large body of work on estimating the entropy of an unknown distribution from i.i.d. samples. These lecture notes give a summary of various entropy estimators for discrete distributions. The LP estimator described in the notes is based on this breakthrough work, which showed that one can significantly improve upon the plugin estimator (a sketch of the plugin estimator appears after this list). Paninski showed that there is no unbiased estimator for the entropy. A different approach to the LP estimator was recently given by Wu and Yang. These ideas have been generalized to estimating more general functionals of discrete distributions.
  • This paper gives a unified approach for estimating functionals of distributions by first estimating the histogram and then using a plugin estimate.
  • Distributed mean estimation
  • Properties of Good-Turing estimation
  • This monograph is a great introduction to graphical models and variational methods for inference.
  • Any of the top 100 publications in this list would be good.
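
To make the plugin estimator mentioned in the first bullet concrete, here is a minimal Python sketch (a hypothetical illustration written for this page, not code from any of the papers above): it forms the empirical histogram from i.i.d. samples and evaluates the functional of interest, here the entropy, at that histogram. The same pattern covers the histogram-then-plugin approach for other functionals.

    import numpy as np

    def empirical_distribution(samples):
        """Empirical histogram (the plugin distribution) from i.i.d. samples."""
        _, counts = np.unique(samples, return_counts=True)
        return counts / counts.sum()

    def plugin_estimate(samples, functional):
        """Plugin estimator: evaluate the functional at the empirical histogram."""
        return functional(empirical_distribution(samples))

    def entropy(p):
        """Shannon entropy in nats: H(p) = -sum_x p(x) log p(x)."""
        p = p[p > 0]  # convention: 0 log 0 = 0
        return -np.sum(p * np.log(p))

    # Uniform source over 4 symbols: true entropy is log 4, about 1.386 nats.
    # The plugin estimate is biased downward, and as noted above (Paninski),
    # no unbiased estimator for the entropy exists.
    rng = np.random.default_rng(0)
    samples = rng.integers(0, 4, size=1000)
    print(plugin_estimate(samples, entropy))

When the alphabet is large relative to the number of samples, this plugin estimate systematically underestimates the entropy; the LP and Wu-Yang estimators mentioned above are designed to correct exactly this.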

Author: Shashank Vatedka

Created: 2021-04-19 Mon 18:29