ELEN 244: Syllabus

Ivana Maric, Spring 2016

This course teaches the fundamentals of information theory. The following topics will be covered: information measures, lossless data compression, the channel coding theorem, the joint source-channel coding theorem, and Gaussian channels. A few topics from network information theory, including the broadcast and multiple-access channels, will also be covered. The recently proposed family of capacity-achieving channel codes, polar codes, will be introduced (time permitting).

A more detailed syllabus (subject to small changes) is as follows.

Lecture 1:

Introduction. The communications problem. Entropy, relative entropy, mutual information, and their properties (Chapter 2)
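As a small numerical illustration of the entropy measure from this lecture, here is a minimal Python sketch (not from the course materials; the function name is my own):

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a probability distribution p."""
    # Terms with zero probability contribute nothing (0 log 0 := 0).
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries one bit of uncertainty; a biased coin carries less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.469
```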

Lecture 2:

Jensen’s inequality, data processing inequality, Fano’s inequality (Chapter 2)

Lecture 3:

The Asymptotic Equipartition Property (AEP) (Chapter 3)

Lecture 4:

Lossless data compression. Kraft inequality. Optimal codes (Chapter 5)

Lecture 5:

Lossless data compression (continued). Huffman codes. Channel capacity. Examples. Jointly typical sequences. (Chapters 5 & 7)
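For the Huffman codes covered here, a minimal construction can be sketched as follows (my own illustrative code, using Python's standard heap; the resulting codeword lengths satisfy the Kraft inequality from Lecture 4):

```python
import heapq
from itertools import count

def huffman_code(freqs):
    """Build a Huffman code (symbol -> bitstring) from symbol frequencies."""
    tiebreak = count()  # breaks ties so tuples never compare the dicts
    heap = [(w, next(tiebreak), {sym: ""}) for sym, w in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        # Repeatedly merge the two least-probable subtrees.
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (w1 + w2, next(tiebreak), merged))
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.25})
# 'a' gets a 1-bit codeword; 'b' and 'c' get 2-bit codewords.
```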

Lecture 6:

Lecture 7:

Jointly typical sequences. The channel coding theorem (Chapter 7)
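A standard worked example for the channel coding theorem is the binary symmetric channel, whose capacity C = 1 - H(p) follows from the mutual information definitions of Lecture 1. A minimal sketch (my own illustrative function names):

```python
import math

def binary_entropy(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p."""
    return 1 - binary_entropy(p)

print(bsc_capacity(0.0))  # 1.0: a noiseless channel carries one bit per use
print(bsc_capacity(0.5))  # 0.0: output independent of input, nothing gets through
```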

Lecture 8:

The channel coding theorem (continued). The joint source-channel coding theorem (Chapter 7)

Lecture 9:

Gaussian channels: definition and capacity. Bandlimited channels. Parallel Gaussian channels (Chapter 9)
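The Gaussian-channel capacity formulas covered here can be evaluated directly; a small sketch (my own function names, with SNR = P/N and the bandlimited formula C = W log2(1 + P/(N0 W))):

```python
import math

def awgn_capacity(snr):
    """Capacity in bits per channel use of an AWGN channel: C = (1/2) log2(1 + P/N)."""
    return 0.5 * math.log2(1 + snr)

def bandlimited_capacity(W, P, N0):
    """Capacity in bits/s of a bandlimited Gaussian channel with bandwidth W Hz,
    power P, and noise spectral density N0."""
    return W * math.log2(1 + P / (N0 * W))

print(awgn_capacity(3))               # 1.0 bit per channel use
print(bandlimited_capacity(1.0, 3.0, 1.0))  # 2.0 bits/s
```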

Lecture 10:

Network information theory. The multiple-access channel. The broadcast channel (Chapter 14 & Reference 2). Review.