Syllabus: Introduction to Information Theory - 80663
Last update 20-08-2017
HU Credits: 3

Degree/Cycle: 2nd degree (Master)

Responsible Department: Mathematics

Semester: 2nd Semester

Teaching Languages: Hebrew

Campus: E. Safra

Course/Module Coordinator: Dr. Zemer Kosloff

Coordinator Email: zemer.kosloff@mail.huji.ac.il

Coordinator Office Hours:

Teaching Staff:
Dr. Zemer Kosloff

Course/Module description:
This course is an introduction to information theory and its applications in other mathematical disciplines (measure concentration, probability theory) as well as in physics and engineering.

In the first part of the course we will treat the classical theory as introduced in Shannon's seminal paper. In this part we will show how probability can be used to model classical problems such as data compression and channel coding (fixed length and variable length), and introduce basic concepts such as entropy and Fisher information.
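As a small illustration of the basic quantities mentioned above (not part of the official course material), the Shannon entropy and relative entropy of finite distributions can be computed directly from their definitions; the example distributions below are arbitrary:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.
    Terms with p_i = 0 contribute 0 by convention."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def relative_entropy(p, q):
    """Relative entropy (KL divergence) D(p || q) in bits.
    Assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A fair coin carries exactly one bit of uncertainty.
print(entropy([0.5, 0.5]))            # 1.0
# A biased coin carries less than one bit.
print(entropy([0.9, 0.1]))
# D(p || q) >= 0, with equality iff p = q (Gibbs' inequality).
print(relative_entropy([0.5, 0.5], [0.9, 0.1]))
```

The compression interpretation, developed in the course, is that H(p) lower-bounds the expected number of bits per symbol needed to encode i.i.d. draws from p.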

The second part of the course will be concerned with differential entropy, Gaussian channels and applications of information theory to other disciplines.

Course/Module aims:

Learning outcomes - On successful completion of this module, students should be able to:
Understand how information problems are modeled and solved using elementary mathematical tools.
Become familiar with theoretical methods that appear in information-theoretic problems, in particular probabilistic methods and information calculus.

Attendance requirements(%):
0

Teaching arrangement and method of instruction: Lecture

Course/Module Content:
-- Introduction and presentation of the basic concepts such as entropy, relative entropy, mutual information.

-- Shannon's channel coding theorem;

-- Variable-length coding and block data compression;

-- Lempel-Ziv algorithm;

-- Noisy channels;

-- Network Information Theory: The multiple access channels; Slepian-Wolf Lemma for multiple access channels.

-- Further topics to be decided based on the background of the class: linear codes, differential entropy, the Donsker and Varadhan variational description of entropy, information theoretic proof of the central limit theorem.

Required Reading:
--

Additional Reading Material:
T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley-Interscience, 1991.

R. B. Ash, Information Theory, Dover Publications, 1990.

Y. Polyanskiy and Y. Wu, lecture notes for the MIT course on information theory, available for download from the MIT OpenCourseWare website.

O. Johnson, Information Theory and the Central Limit Theorem, Imperial College Press, London, 2004.

Course/Module evaluation:
End of year written/oral examination 90 %
Presentation 0 %
Participation in Tutorials 0 %
Project work 0 %
Assignments 10 %
Reports 0 %
Research project 0 %
Quizzes 0 %
Other 0 %

Additional information:
 
Students needing academic accommodations based on a disability should contact the Center for Diagnosis and Support of Students with Learning Disabilities, or the Office for Students with Disabilities, as early as possible, to discuss and coordinate accommodations, based on relevant documentation.
For further information, please visit the site of the Dean of Students Office.