
Information Theory

Note: Whilst every effort is made to keep the syllabus and assessment records correct for this course, the precise details must be checked with the lecturer(s).


Code: GI12 (Also taught as: 4059)
Year: MSc
Prerequisites: A strong background in university-level mathematics
Term: 1
Taught By: Robert Smith (100%)
Aims: The purpose of the course is threefold. First, to review the mathematical concepts that will be used in the rest of the course. Second, to introduce the elements of information theory and illustrate their relevance to AI and machine learning. Third, to present applications of information theory in AI, especially machine learning and pattern recognition. NOTE: This is a core course for the MSc Intelligent Systems, and an option course for the MSc Vision, Imaging and Virtual Environments.
Learning Outcomes: Students will have a basic knowledge of information theory and will understand the application of information theory to AI.

Content:

Part 1 (6 Lectures): Review of Mathematical Concepts. Basic probability, including probability functions, random variables, expectation, conditional expectation, and statistical inequalities.
Linear algebra, including vector spaces, matrices, eigenvalues and eigenvectors.
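
To indicate the level assumed, two representative items from this review, sketched here in standard notation (these are the usual textbook definitions, not extracts from the course notes):

  \mathbb{E}[X] = \sum_{x} x\,p(x)                 % expectation of a discrete random variable X
  \Pr(X \ge a) \le \mathbb{E}[X]/a, \quad a > 0    % Markov's inequality, for nonnegative X
  A\,v = \lambda\,v, \quad v \ne 0                 % eigenvalue equation for a square matrix A
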
Part 2 (15 Lectures): Introduction to the elements of information theory and illustration of their relevance in AI and machine learning. Entropy, mutual information, relative entropy and relative mutual information, data compression, source coding, random codes, types, noisy channels, information capacity, and Shannon theorems. The exposition of these ideas will be illustrated by means of several examples.
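
For orientation, the central quantities of Part 2, sketched in standard notation (the usual definitions, not necessarily the lecturer's exact formulation):

  H(X) = -\sum_{x} p(x) \log p(x)                            % entropy
  I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}  % mutual information
  D(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}       % relative entropy
  C = \max_{p(x)} I(X;Y)                                     % channel capacity

Shannon's channel coding theorem then states that reliable communication over a noisy channel is possible at any rate below C, and impossible above it.
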
Part 3 (9 Lectures): Applications of information theory in AI, especially machine learning and pattern recognition. Learning prototypes and vector quantization, Gaussian processes and Bayesian inference, elements of VC-theory, feature selection and mutual information, learning decision trees, maximum entropy discrimination.
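
As an informal illustration of one Part 3 topic, feature selection via mutual information, the following minimal Python sketch scores two candidate features against a class label. The data and function names are invented for this example and are not course material:

from collections import Counter
from math import log2

def entropy(values):
    """Shannon entropy H(X), in bits, estimated from a sequence of outcomes."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from paired samples."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Hypothetical binary features and class labels, invented for illustration.
label    = [0, 0, 0, 0, 1, 1, 1, 1]
feature1 = [0, 0, 0, 1, 1, 1, 1, 1]   # mostly tracks the label
feature2 = [0, 1, 0, 1, 0, 1, 0, 1]   # statistically independent of the label

for name, feat in [("feature1", feature1), ("feature2", feature2)]:
    print(name, round(mutual_information(feat, label), 3))
# feature1 scores about 0.549 bits; feature2 scores 0.0:
# feature1 carries information about the label, feature2 carries none.

Ranking features by their estimated mutual information with the label, and keeping the top scorers, is one simple instance of the information-theoretic feature selection covered in this part.
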

Method of Instruction:

Lecture presentations, problem-solving coursework, in-class discussions, and directed self-study using the course texts.

Assessment:

The course has the following assessment components:

  • Written Examination (2.5 hours, 80%)
  • Coursework Section (1 piece, 20%)
To pass this course, students must:
  • Obtain at least 40% on the coursework component
  • Obtain an average of at least 50% when the coursework and examination components are weighted together as above
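For example, a coursework mark of 45% and an examination mark of 52% give a weighted average of 0.2 × 45 + 0.8 × 52 = 50.6%, satisfying both requirements.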
The examination rubric is:
Answer three questions out of four. All questions carry equal marks.

Resources:

Cover, T. M. and Thomas, J. A. (1991) Elements of Information Theory. New York: Wiley.

Other suggested items are:

McEliece, R. J. (1977) The Theory of Information and Coding: A Mathematical Framework for Communication. Reading, Mass.: Addison-Wesley.

Golomb, S. W., Peile, R. E., and Scholtz, R. A. (1994) Basic Concepts in Information Theory and Coding: The Adventures of Secret Agent 00111. New York: Plenum Press.

Hamming, R. W. (1986) Coding and Information Theory. Englewood Cliffs, NJ: Prentice-Hall, 2nd edition.

Khinchin, A. I. (1957) Mathematical Foundations of Information Theory. New York: Dover Publications.

MacKay, D. (2003) Information Theory, Pattern Recognition and Neural Networks. Available at: http://www.inference.phy.cam.ac.uk/mackay/itprnn/course.html

Lecture notes

MSc Intelligent Systems Homepage

 
Last updated: 14 July 2006. Maintained by Jill Saunders.