Syllabus: INTRODUCTION TO MACHINE LEARNING - 67577
Last update 18-10-2018
HU Credits: 5

Degree/Cycle: 1st degree (Bachelor)

Responsible Department: Computer Sciences

Semester: 2nd Semester

Teaching Languages: Hebrew

Campus: E. Safra

Course/Module Coordinator: Dr. Matan Gavish

Coordinator Email: gavish at cs huji ac il

Coordinator Office Hours: Wed 12:00-13:00

Teaching Staff:
Dr. Matan Gavish
Mr. Gadi Mintz
Mr. Gilad Green
Mr. Asaf Yehoodai

Course/Module description:
This is an introductory course to the field of machine learning. The course covers the foundations of statistical learning and the applicability of machine learning to real-world problems. In particular, we will focus on the PAC model and address fundamental questions such as: What is machine learning? What types of concepts are learnable? How can we learn from data? We will also build a machine learning toolbox and cover additional models of learning, such as online learning, unsupervised learning, clustering, generative models, and parameter estimation. Beyond the theoretical foundations, we will cover tools that have proven effective for practical problems, in particular: decision trees, deep learning, SVM and kernel methods, Lasso, nearest neighbor, boosting, PCA, the Perceptron, and Weighted Majority. The course will include theoretical exercises as well as empirical projects.
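As a concrete taste of the practical toolbox mentioned above, here is a minimal sketch of the Perceptron in Python (assuming linearly separable data; the function name and parameters are illustrative only and are not part of the course materials):

    import numpy as np

    def perceptron(X, y, max_iters=1000):
        """Batch Perceptron: X is an (n, d) sample matrix, y holds labels in {-1, +1}.
        On linearly separable data, returns a weight vector w with sign(X @ w) matching y."""
        w = np.zeros(X.shape[1])
        for _ in range(max_iters):
            mistakes = np.nonzero(y * (X @ w) <= 0)[0]  # indices of misclassified points
            if mistakes.size == 0:                      # no mistakes: w separates the data
                break
            w += y[mistakes[0]] * X[mistakes[0]]        # update on the first mistake
        return w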

Course/Module aims:
Understand the foundations of learning theory and the major learning algorithms.

Learning outcomes - On successful completion of this module, students should be able to:
define PAC learning (a sketch of the definition appears after this list).
employ algorithms learnt in class.
choose the appropriate algorithm for a given problem.
prove basic results in the theory of learning.
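
For reference, the standard definition referred to in the first outcome (a sketch, following Shalev-Shwartz and Ben-David, listed under Additional Reading): a hypothesis class $\mathcal{H}$ is PAC learnable if there exist a sample-complexity function $m_{\mathcal{H}} : (0,1)^2 \to \mathbb{N}$ and a learning algorithm $A$ such that for every $\epsilon, \delta \in (0,1)$, every distribution $\mathcal{D}$ over the domain $\mathcal{X}$, and every labeling function $f : \mathcal{X} \to \{0,1\}$ satisfying the realizability assumption, running $A$ on $m \ge m_{\mathcal{H}}(\epsilon, \delta)$ i.i.d. examples drawn from $\mathcal{D}$ and labeled by $f$ returns a hypothesis $h$ with
$$\Pr\big[L_{\mathcal{D},f}(h) \le \epsilon\big] \ge 1 - \delta, \qquad \text{where } L_{\mathcal{D},f}(h) = \Pr_{x \sim \mathcal{D}}\big[h(x) \ne f(x)\big].$$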

Attendance requirements (%):
0

Teaching arrangement and method of instruction: lectures, recitations, home exercises

Course/Module Content:
Probability: Review
Measure Concentration
Introduction and Gentle Start
A Formal Learning Model
PAC Model
Learning via Uniform Convergence
The Bias-Complexity Tradeoff
No-Free-Lunch
VC-Dimension
Linear Predictors
Boosting
SVM
Deep Neural Networks
Validation
MDL and SRM
Convex Optimization
Convex Learning Problems
Stochastic Gradient Descent
Regularized Loss Minimization
Ridge Regression
SVM
Kernels
Decision Trees
Nearest Neighbor
Online Learning
Clustering
Dimensionality Reduction
Spectral Clustering
Generative Models

Required Reading:
N.A

Additional Reading Material:
1. Shai Shalev-Shwartz and Shai Ben-David, Understanding Machine Learning: From Theory to Algorithms. Cambridge University Press.

2. Trevor Hastie, Robert Tibshirani, and Jerome Friedman, The Elements of Statistical Learning, 2nd Edition. Springer.

Course/Module evaluation:
End of year written/oral examination 90 %
Presentation 0 %
Participation in Tutorials 0 %
Project work 0 %
Assignments 10 %
Reports 0 %
Research project 0 %
Quizzes 0 %
Other 0 %

Additional information:
N.A
 
Students needing academic accommodations based on a disability should contact the Center for Diagnosis and Support of Students with Learning Disabilities, or the Office for Students with Disabilities, as early as possible, to discuss and coordinate accommodations, based on relevant documentation.
For further information, please visit the site of the Dean of Students Office.