Classical Theory of Machine Learning
INTRODUCING THE PAC PARADIGM
The course commences with a rigorous exposition of the classical theory of ML. Here, the classical Probably Approximately Correct (PAC) paradigm is introduced, and through it the notions of true and empirical error are studied systematically.
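For concreteness, and using standard notation that may differ slightly from that of the lecture notes: given a data distribution $\mathcal{D}$, a target labelling function $f$, and a training sample $S = \{(x_1, y_1), \ldots, (x_m, y_m)\}$, the true and empirical errors of a hypothesis $h$ are typically defined as

$$L_{\mathcal{D},f}(h) \;=\; \Pr_{x \sim \mathcal{D}}\big[h(x) \neq f(x)\big], \qquad L_S(h) \;=\; \frac{1}{m}\,\big|\{\, i \in [m] : h(x_i) \neq y_i \,\}\big|.$$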
​
The key message in this part of the course is the development of techniques for estimating how large a training sample fed to a learning algorithm (model) must be in order to control the model's true and empirical errors.
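As an illustration of the kind of estimate involved (this is the standard bound for a finite hypothesis class $\mathcal{H}$ in the realizable PAC setting, quoted here as context rather than taken from the lecture notes): a sample of size

$$m \;\geq\; \frac{1}{\epsilon}\left(\ln|\mathcal{H}| + \ln\frac{1}{\delta}\right)$$

suffices to guarantee that, with probability at least $1 - \delta$ over the draw of the sample, every hypothesis in $\mathcal{H}$ that is consistent with the sample has true error at most $\epsilon$.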
Lecture notes & videos
Classical ML Models
In this section of the course, we study some of the more classical models (learning algorithms) of ML. The notions of true and empirical error introduced in the previous section follow us here; for each model, we seek to provide estimates for its associated errors. A minimal sketch of this viewpoint in code appears below.
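As a hedged illustration (scikit-learn and a synthetic dataset are assumptions made for this sketch, not part of the course material), the empirical error is measured on the training sample, while error on a held-out sample serves as a rough proxy for the true error:

```python
# Minimal sketch: empirical vs. held-out error for a classical model.
# Assumes scikit-learn and numpy are available; not taken from the course notes.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Synthetic binary classification data, split into a training and a held-out sample.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

model = LinearSVC()          # a classical learning algorithm
model.fit(X_train, y_train)

empirical_error = np.mean(model.predict(X_train) != y_train)  # error on the training sample
held_out_error = np.mean(model.predict(X_test) != y_test)     # proxy for the true error

print(f"empirical error: {empirical_error:.3f}, held-out error: {held_out_error:.3f}")
```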