Engineering Tripos Part IIA, 3F7: Information Theory and Coding, 2019-20


Leader

Dr R Venkataramanan

Lecturer

Dr R Venkataramanan

Lab Leader

Dr J Sayir

Timing and Structure

Michaelmas Term. 16 lectures. Assessment: 100% exam

Aims

The aims of the course are to:

  • Introduce students to the principles of information theory, data compression, and error correction, which form the foundations of modern communication and information processing systems.

Objectives

As specific objectives, by the end of the course students should be able to:

  • Explain why entropy and channel capacity arise as fundamental limits for data compression and transmission, respectively
  • Understand and implement basic compression algorithms such as Huffman coding and arithmetic coding
  • Encode and decode information using simple linear block codes
  • Implement decoding algorithms for modern error-correcting codes such as LDPC codes

Content

 

Information Theory and Data Compression (11L)

  1. Probability fundamentals; Definitions of entropy, joint entropy, conditional entropy: interpretations as measures of uncertainty 
  2. Noiseless source coding theorem; Significance of entropy as the fundamental limit of compression 
  3. Bounds on code length for lossless data compression
  4. Practical compression algorithms: Huffman coding, arithmetic coding (see the sketch after this list)
  5. Relative entropy and mutual information: properties and some applications
  6. Discrete Memoryless Channels and Channel Capacity
  7. The channel coding theorem: Random coding and the direct coding theorem; Fano's inequality and the converse theorem
  8. The additive white Gaussian noise (AWGN) channel and its capacity
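
For concreteness, a minimal Python sketch touching on items 1-4 is given below. It is illustrative only, not part of the course handouts: it computes the entropy of an assumed toy source, builds a Huffman code by repeatedly merging the two least probable nodes, and checks that the average codeword length L satisfies H(X) <= L < H(X) + 1, as the source coding bounds guarantee.

    import heapq
    from math import log2

    # Toy source distribution, assumed purely for illustration
    probs = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}

    # Entropy H(X) = -sum p*log2(p), in bits
    H = -sum(p * log2(p) for p in probs.values())

    # Huffman coding: repeatedly merge the two least probable nodes.
    # Heap entries are (probability, tie-break counter, {symbol: codeword so far}).
    heap = [(p, i, {s: ''}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        # Prepend '0' to codewords in one subtree and '1' in the other
        merged = {s: '0' + w for s, w in c1.items()}
        merged.update({s: '1' + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    code = heap[0][2]

    avg_len = sum(probs[s] * len(w) for s, w in code.items())
    print(f"H(X) = {H:.3f} bits, average Huffman length = {avg_len:.3f} bits")
    assert H <= avg_len < H + 1   # noiseless source coding bounds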

 

Channel Coding (Error-correcting codes) (5L)

  1. Introduction to block codes; Linear block codes
  2. Representing a linear code using a factor graph; Sparse-graph codes
  3. Message-passing decoding of sparse-graph codes for binary erasure channels (see the sketch after this list)
  4. The Belief Propagation (BP) algorithm; BP decoding of sparse-graph codes for general binary input channels
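
The Python sketch below illustrates items 2-3. It uses the small (7,4) Hamming parity-check matrix as a stand-in for a large sparse LDPC matrix (an assumption made only to keep the example short) and decodes erasures by 'peeling': any check equation involving exactly one erased bit determines that bit, and the process repeats until nothing more can be resolved.

    import numpy as np

    # Parity-check matrix of the (7,4) Hamming code; real LDPC matrices are
    # much larger and sparser, but the peeling rule is identical.
    H = np.array([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])

    def peel_decode(y):
        """Erasure decoding by peeling: y is a list with 0/1 for received bits
        and None for erasures; each check with a single erased bit solves it."""
        x = list(y)
        progress = True
        while progress:
            progress = False
            for row in H:
                erased = [j for j in np.flatnonzero(row) if x[j] is None]
                if len(erased) == 1:
                    j = erased[0]
                    # The erased bit equals the parity of the known bits in this check
                    x[j] = sum(x[k] for k in np.flatnonzero(row) if k != j) % 2
                    progress = True
        return x

    # All-zero codeword sent; the channel erases bits 0 and 4
    received = [None, 0, 0, 0, None, 0, 0]
    print(peel_decode(received))   # recovers [0, 0, 0, 0, 0, 0, 0]

For general binary-input channels the same factor graph carries probabilistic (belief propagation) messages rather than hard erasure decisions; that is the subject of item 4 above.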

Further notes

This module will be of interest to anyone who wishes to understand how information can be mathematically modelled, measured, compressed, and transmitted. Though not a prerequisite for 3F4, 3F7 provides a good foundation for further study of digital communications.

Coursework

Data Compression: Build your own CamZIP

Learning objectives

  • To implement various data compression algorithms in Python/Matlab/Octave
  • To compare the compression performance of different techniques on text files
  • To understand the effects of finite-precision implementation on the compression performance of arithmetic coding (see the sketch after this list)
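
As a taste of the last objective, the sketch below (Python, with toy probabilities and an assumed input string; it is not the CamZIP starter code) carries out the arithmetic-coding interval update with exact rational arithmetic. The interval width shrinks geometrically with the sequence length, which is exactly why a practical coder must work with fixed-width integers and renormalise the interval after every symbol.

    from fractions import Fraction
    from math import log2

    # Toy two-symbol source, assumed for illustration only
    probs = {'a': Fraction(4, 5), 'b': Fraction(1, 5)}

    # Lower edge of each symbol's sub-interval (cumulative distribution)
    edges, cum = {}, Fraction(0)
    for s, p in probs.items():
        edges[s] = cum
        cum += p

    def encode_interval(seq):
        """Exact interval update [lo, lo + width) for the whole sequence.
        A finite-precision coder cannot store these fractions exactly and
        must rescale the interval as it narrows."""
        lo, width = Fraction(0), Fraction(1)
        for s in seq:
            lo += width * edges[s]
            width *= probs[s]
        return lo, width

    lo, width = encode_interval('aababaaaab')
    # Any number inside [lo, lo + width) identifies the sequence;
    # roughly -log2(width) bits are needed to specify one.
    print(f"width = {float(width):.3e}, about {-log2(float(width)):.1f} bits needed")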

Practical information:

  • Students can do the lab in their own time. Scheduled 'helpdesk' sessions will be held in DPO during Michaelmas term (times will be announced on Moodle)

Full Technical Report:

Students will have the option to submit a Full Technical Report.

Booklists

The following are useful references:

  • T. Cover and J. Thomas, Elements of Information Theory, Wiley-Blackwell, 2006.
  • D. MacKay, Information Theory, Inference and Learning Algorithms, Cambridge University Press, 2003 (free electronic copy available for download)
  • T. Richardson and R. Urbanke, Modern Coding Theory, Cambridge University Press, 2008.
  • R. Blahut, Algebraic Codes for Data Transmission, Cambridge University Press, 2012. 

Examination Guidelines

Please refer to Form & conduct of the examinations.

 