HU Credits:
5
Degree/Cycle:
1st degree (Bachelor)
Responsible Department:
Computer Sciences
Semester:
1st and/or 2nd Semester
Teaching Languages:
English and Hebrew
Campus:
E. Safra
Course/Module Coordinator:
Prof. Yedid Hoshen
Coordinator Office Hours:
TBD
Teaching Staff:
Dr. Gabriel Satanovsky, Prof. Roy Schwartz, Prof. Yedid Hoshen, Mr. Michael Joseph
Course/Module description:
This is an introductory course to the field of machine learning. The course covers the foundations of statistical learning and the applicability of machine learning to real-world problems. In particular, we will address fundamental questions such as: What is machine learning? What can we learn from data, and how? We will build a machine learning toolbox and also cover additional models of learning such as unsupervised learning, clustering, generative models, and representation learning. Beyond the theoretical foundations, we will cover tools that have proven useful for solving practical problems, in particular: decision trees, deep learning, SVM, nearest neighbor, boosting, PCA, weighted majority, convolutional neural networks, recurrent neural networks, and transformers. The course includes theoretical exercises as well as empirical projects.
To complete the course exercises, students will need to purchase a two-month subscription to Google Colab Pro. The current cost (as of October 2024) is $10 per month. Students in need of financial assistance can apply for it.
For special requests regarding enrollment, please fill out this form:
https://forms.gle/2owuGQGuwHHfc5717
Course/Module aims:
Understand the foundations of learning theory and the major learning algorithms.
Learning outcomes - On successful completion of this module, students should be able to:
define PAC learning.
apply the algorithms taught in class.
choose an appropriate algorithm for a given problem.
prove basic results in learning theory.
Attendance requirements (%):
0
Teaching arrangement and method of instruction:
lectures, recitations, programming labs, home exercises, and a hackathon
Course/Module Content:
A Formal Learning Model
The PAC Model
The Bias-Complexity Tradeoff
No-Free-Lunch
VC Dimension
Linear Predictors
Boosting
SVM
Deep Neural Networks
Validation
Stochastic Gradient Descent
Regularized Loss Minimization
Ridge Regression
Decision Trees
Nearest Neighbor
Clustering
Dimensionality Reduction
Spectral Clustering
Convolutional Neural Networks
Recurrent Neural Networks
Transformers
Ethical Aspects of Machine Learning
Generative Models
Required Reading:
N/A
Additional Reading Material:
1. Shai Shalev-Shwartz and Shai Ben-David, Understanding Machine Learning: From Theory to Algorithms. Cambridge University Press.
2. Trevor Hastie, Robert Tibshirani, and Jerome Friedman, The Elements of Statistical Learning, 2nd Edition. Springer.
Grading Scheme:
Written Exam: 80 %
Submission assignments during the semester (exercises / essays / audits / reports / forum / simulation / other): 20 %
Additional information:
N/A