Machine learning studies algorithms that build models from data for
subsequent use in prediction, inference, and decision-making tasks.
Although the field has been active for the last 60 years, demand for,
and trust in, machine learning has exploded recently as ever more data
become available and the problems we need to address become practically
impossible to program directly. In this advanced course we cover
essential algorithms, concepts, and principles of machine learning.
Alongside the traditional exposition, we will see how these principles
are being revisited in light of recent discoveries in the field.
1. Introduction
Youtube Lectures
- Introductions 7:51
- Why Machine Learning 12:00
- What is Machine Learning 18:57
- History of Machine Learning 17:33
- Reinforcement Learning 10:32
- Course Overview 19:26
- The Project 20:03
2. Foundations of learning
Youtube Lectures
- Formalizing the Problem of Learning 24:19
- Inductive Bias 12:03
- Can We Bound the Probability of Error? 25:56
3. PAC learnability
Youtube Lectures
- Main Definitions from Lecture 2 13:52
- Agnostic PAC Learning 53:35
- Learning via Uniform Convergence 10:15
4. Linear Algebra and Optimization (recap)
- 3Blue1Brown Playlist
5. Linear learning models
Youtube Lectures
- Linear Decision Boundary 34:10
- Perceptron 37:10
- Perceptron Extensions 14:09
- Linear Classifier for Linearly non Separable Classes 8:59
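As a quick illustration of the material in this section, here is a minimal perceptron training loop on a toy linearly separable dataset. This is a sketch of the classic algorithm, not code from the lectures; the data and variable names are my own.

```python
import numpy as np

# Toy linearly separable data: label is the sign of x1 + x2 - 1.
X = np.array([[0.0, 0.0], [2.0, 1.0], [1.0, 2.0], [0.2, 0.3]])
y = np.array([-1, 1, 1, -1])

w = np.zeros(2)
b = 0.0
for _ in range(100):                 # epochs (an upper bound; it stops early)
    errors = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:   # misclassified point -> perceptron update
            w += yi * xi
            b += yi
            errors += 1
    if errors == 0:                  # a full pass with no mistakes: converged
        break

print(np.sign(X @ w + b))           # matches y on this separable toy set
```

For separable data the perceptron convergence theorem guarantees this loop terminates; the epoch cap only matters for non-separable inputs, which the "Linearly non Separable Classes" lecture addresses.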
6. Principal Component Analysis
Youtube Lectures
- Linear Regression 39:24
- Linear Algebra Micro Refresher 2:04
- Spectral Theorem 25:54
- Principal Component Analysis 22:29
- Demonstration 17:38
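The lectures derive PCA via the spectral theorem; a minimal NumPy sketch of that route (eigendecomposition of the sample covariance, on synthetic data of my own choosing) might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
# 200 points stretched along the first axis, so one component dominates.
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

Xc = X - X.mean(axis=0)               # center the data
C = Xc.T @ Xc / (len(Xc) - 1)         # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)  # spectral theorem: C is symmetric
order = np.argsort(eigvals)[::-1]     # sort components by explained variance
components = eigvecs[:, order]
Z = Xc @ components[:, :1]            # project onto first principal component

print(eigvals[order])                 # variances, descending
```

Using `eigh` rather than `eig` is the idiomatic choice here precisely because the covariance matrix is symmetric, which is what the Spectral Theorem lecture justifies.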
7. Curse of Dimensionality
Youtube Lectures
- Curse of Dimensionality 1:16:27
8. Bayesian Decision Theory
Youtube Lectures
- Bayesian Decision Theory 56:47
9. Parameter estimation: MLE
Youtube Lectures
- Independence 12:07
- Maximum Likelihood Estimation 50:35
- MLE as KL-divergence minimization 21:41
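For a univariate Gaussian, the MLE covered here has the familiar closed form: the sample mean, and the variance with the 1/n factor. A quick numerical sanity check on synthetic data (not from the lectures):

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=5.0, scale=2.0, size=10_000)  # true mu=5, sigma^2=4

# Closed-form MLE for a univariate Gaussian:
mu_hat = data.mean()                      # maximizes the log-likelihood in mu
var_hat = ((data - mu_hat) ** 2).mean()   # note the 1/n (biased) estimator

print(mu_hat, var_hat)                    # close to 5.0 and 4.0
```

The 1/n rather than 1/(n-1) factor is exactly what maximum likelihood produces; the small resulting bias vanishes as n grows.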
10. Parameter estimation: MAP & Naïve Bayes
Youtube Lectures
- MAP Estimation 56:00
- The Naïve Bayes Classifier 37:09
11. Logistic Regression
Youtube Lectures
- NB to LR 19:49
- Defining Logistic Regression 27:42
- Solving Logistic Regression 23:35
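One way to "solve" logistic regression, in the spirit of this section, is batch gradient descent on the negative log-likelihood, whose gradient is X^T(sigma(Xw) - y). A sketch with synthetic data and my own choice of step size and iteration count:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 2))
w_true = np.array([2.0, -1.0])
y = (sigmoid(X @ w_true) > rng.uniform(size=200)).astype(float)  # Bernoulli labels

w = np.zeros(2)
lr = 0.1
for _ in range(500):
    grad = X.T @ (sigmoid(X @ w) - y) / len(y)  # gradient of the mean NLL
    w -= lr * grad

print(w)  # should be close in sign and rough magnitude to w_true
```

With only 200 samples the estimate is noisy, but the signs and relative magnitudes of the recovered weights should agree with `w_true`.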
12. Kernel Density Estimation
Youtube Lectures
- Non-parametric Density Estimation 1:13:33
13. Support Vector Machines
Youtube Lectures
- Max Margin Classifier 35:53
- Lagrange Multipliers 32:45
- Dual Formulation of Linear SVM 10:34
- Kernel Trick and Soft Margin 27:28
14. Matrix Factorization
Youtube Lectures
- Matrix Factorization 1:24:22
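One concrete instance of matrix factorization is the truncated SVD, which by the Eckart–Young theorem gives the best low-rank approximation in Frobenius norm. A sketch on a synthetic near-rank-2 matrix (data and rank choice are my own, not the lecture's example):

```python
import numpy as np

rng = np.random.default_rng(5)
# A rank-2 matrix plus a little noise.
A = (rng.normal(size=(30, 2)) @ rng.normal(size=(2, 20))
     + 0.01 * rng.normal(size=(30, 20)))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = 2
A_r = (U[:, :r] * s[:r]) @ Vt[:r]   # best rank-r approximation of A

print(np.linalg.norm(A - A_r))      # residual at roughly the noise level
```

The residual equals the norm of the discarded singular values, so for a truly low-rank signal it collapses to the noise floor.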
15. Stochastic Gradient Descent
Youtube Lectures
- Stochastic Gradient Descent 1:06:57
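The basic SGD loop, sketched on a least-squares objective with one randomly sampled example per step. The data, step size, and iteration count here are arbitrary choices of mine for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=500)

w = np.zeros(3)
lr = 0.01
for t in range(20_000):
    i = rng.integers(len(y))      # pick one example uniformly at random
    err = X[i] @ w - y[i]
    w -= lr * err * X[i]          # gradient of the single-example loss 0.5*err**2

print(w)                          # hovers near w_true
```

With a constant step size the iterate does not converge exactly but fluctuates in a neighborhood of the optimum; decaying schedules, covered in the lecture, remove that residual noise.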
16. k-means Clustering
Youtube Lectures
- Clustering 6:05
- Gaussian Mixture Models 16:34
- MLE recap 4:20
- Hard k-means Clustering 30:27
- Soft k-means Clustering 7:18
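Hard k-means, as covered here, alternates an assignment step and a mean-update step until the centers stop moving. A minimal sketch on two synthetic blobs (data and initialization scheme are my own):

```python
import numpy as np

rng = np.random.default_rng(4)
# Two well-separated blobs of 100 points each.
X = np.vstack([rng.normal(0.0, 0.5, size=(100, 2)),
               rng.normal(5.0, 0.5, size=(100, 2))])

k = 2
centers = X[rng.choice(len(X), size=k, replace=False)]   # init from data points
for _ in range(50):
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    labels = d.argmin(axis=1)                            # assignment step
    new_centers = np.array([X[labels == j].mean(axis=0)
                            if (labels == j).any() else centers[j]
                            for j in range(k)])          # mean-update step
    if np.allclose(new_centers, centers):                # fixed point: done
        break
    centers = new_centers

print(centers)   # one center near (0, 0), one near (5, 5)
```

Soft k-means, covered in the last lecture, replaces the hard `argmin` assignment with responsibilities, which is the bridge to the GMM/EM material in the next section.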
17. Expectation Maximization
Youtube Lectures
- Do we even need EM for GMM? 14:39
- A “hacky” GMM estimation 15:17
- MLE via EM 38:28
18. Automatic Differentiation
Youtube Lectures
- Introduction 25:10
- Forward Mode AD 26:46
- A minute of Backprop 2:26
- Reverse mode AD 17:26
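Forward-mode AD can be sketched with dual numbers: each value carries its derivative, and every arithmetic operation propagates both. This toy class (my own, supporting only + and *) is enough to differentiate a polynomial:

```python
class Dual:
    """A number paired with its derivative (forward-mode AD)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)  # sum rule
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def f(x):
    return x * x * x + 2 * x   # f(x) = x^3 + 2x, so f'(x) = 3x^2 + 2

x = Dual(2.0, 1.0)             # seed the tangent: dx/dx = 1
y = f(x)
print(y.val, y.dot)            # 12.0 14.0
```

One forward pass yields the derivative with respect to one input; reverse mode, covered in the last lecture, instead gets all input derivatives from one backward pass.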
19. Nonlinear Embedding Approaches
Youtube Lectures
- Manifold Learning 20:13
20. Model Comparison I
Youtube Lectures
- Bias Variance Trade-Off 36:52
- No Free Lunch Theorem 7:29
- Problems with using accuracy as performance indicator 12:39
- Confusion Matrix 25:15
21. Model Comparison II
Youtube Lectures
- Cross validation and hyperopt 29:08
- Expected Value Framework 22:48
- Visualizing Model Performance 1 31:02
- Receiver Operating Characteristics 22:34
22. Model Calibration
Youtube Lectures
- On Model Calibration 36:53
23. Convolutional Neural Networks
Youtube Lectures
- Building Blocks 39:22
- Skip Connection 38:46
- Fully Convolutional Networks 8:07
- Semantic Segmentation with Twists 23:40
- Special Convolutions 20:15
24. Word Embedding
Youtube Lectures
- Introduction 10:35
- Semantic Matrix 30:26
- word2vec 54:22