Syllabus: Introduction to Information Theory - 80663
Last update 10-02-2021
HU Credits: 3

Degree/Cycle: 2nd degree (Master)

Responsible Department: Mathematics

Semester: 2nd Semester

Teaching Languages: Hebrew

Campus: E. Safra

Course/Module Coordinator: Dr. Zemer Kosloff

Coordinator Email: zemer.kosloff@mail.huji.ac.il

Coordinator Office Hours: By appointment

Teaching Staff:
Dr. Zemer Kosloff

Course/Module description:
This course is an introduction to information theory and its applications in other mathematical disciplines (measure concentration, probability theory) as well as in Physics and Engineering.

In the first part of the course we will treat the classical theory as introduced in Shannon's seminal paper. We will show how probability can be used to model classical problems such as data compression and channel coding (fixed-length and variable-length), and we will introduce basic concepts such as entropy, divergence and Fisher information (a short numerical sketch of the first two follows this description).

The second part of the course will be concerned with applications of these methods to statistical physics, probability theory (large deviations), optimal transport and combinatorics.

The third part, time permitting, will deal with differential entropy and Barron's entropic proof of the central limit theorem (CLT).
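As a rough illustration of two of the basic concepts named above, here is a minimal Python sketch of Shannon entropy and relative entropy (KL divergence) for finite distributions; the function names are illustrative and not part of the course material.

import math

def entropy(p, base=2):
    # Shannon entropy H(p) = -sum_i p_i * log p_i of a finite distribution p.
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

def kl_divergence(p, q, base=2):
    # Relative entropy D(p || q) = sum_i p_i * log(p_i / q_i),
    # taken to be infinite when some q_i = 0 < p_i.
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue
        if qi == 0:
            return float("inf")
        total += pi * math.log(pi / qi, base)
    return total

# A fair coin carries 1 bit of entropy, a biased coin carries less,
# and the divergence between the two distributions is strictly positive.
print(entropy([0.5, 0.5]))                    # 1.0
print(entropy([0.9, 0.1]))                    # about 0.47
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # about 0.74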

Course/Module aims:
Gain familiarity with the field of information theory and how it connects to other mathematical disciplines.

Learning outcomes - On successful completion of this module, students should be able to:
Understand how information-theoretic problems are modeled and solved using elementary mathematical tools.
Become familiar with the theoretical methods that appear in information-theoretic problems, in particular probabilistic methods and the calculus of information measures.

Attendance requirements (%):
0

Teaching arrangement and method of instruction: Lecture

Course/Module Content:
-- Introduction and presentation of basic concepts such as entropy, relative entropy and mutual information

-- Method of types

-- Shannon's channel coding theorem

-- Variable-length coding and block data compression

-- Lempel-Ziv algorithm (see the sketch after this list)

-- Noisy channels

-- Network information theory: multiple-access channels; the Slepian-Wolf theorem for multiple-access channels

-- Large deviations

-- Gibbs measures

-- Optimal transport

-- Stationary channels

-- (Time permitting) Differential entropy and the CLT
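As referenced in the Lempel-Ziv item above, the following is a minimal Python sketch of LZ78-style incremental parsing (an assumption about the variant treated in class; the function name is illustrative). Each phrase is a previously seen phrase extended by one new symbol, and the number of phrases in the parse is what relates the compression rate to the entropy of the source.

def lz78_parse(s):
    # Split s into phrases; each output pair is (index of a previously
    # seen phrase, the single new symbol that extends it).
    dictionary = {"": 0}   # phrase -> index; index 0 is the empty phrase
    output = []
    phrase = ""
    for ch in s:
        if phrase + ch in dictionary:
            phrase += ch   # keep extending the current match
        else:
            output.append((dictionary[phrase], ch))
            dictionary[phrase + ch] = len(dictionary)
            phrase = ""
    if phrase:             # flush a trailing match, if any
        output.append((dictionary[phrase[:-1]], phrase[-1]))
    return output

print(lz78_parse("aababbbaaaba"))  # six (index, symbol) pairs for this input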

Required Reading:
--

Additional Reading Material:
T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley-Interscience, 1991.

R. B. Ash, Information Theory, Dover Publications, 1990.

Y. Polyanskiy and Y. Wu, Lecture Notes on Information Theory, MIT course notes, available for download on the MIT OpenCourseWare website.

O. Johnson, Information Theory and the Central Limit Theorem, Imperial College Press, London, 2004.

Course/Module evaluation:
End of year written/oral examination 60 %
Presentation 0 %
Participation in Tutorials 0 %
Project work 0 %
Assignments 40 %
Reports 0 %
Research project 0 %
Quizzes 0 %
Other 0 %

Additional information:
 
Students needing academic accommodations based on a disability should contact the Center for Diagnosis and Support of Students with Learning Disabilities, or the Office for Students with Disabilities, as early as possible, to discuss and coordinate accommodations, based on relevant documentation.
For further information, please visit the site of the Dean of Students Office.