# | date | topic | description |
---|---|---|---|
1 | 22-Aug-2022 | Introduction | |
2 | 24-Aug-2022 | Foundations of learning | |
3 | 29-Aug-2022 | PAC learnability | |
4 | 31-Aug-2022 | Linear algebra (recap) | hw1 released |
 | 05-Sep-2022 | Holiday | |
5 | 07-Sep-2022 | Linear learning models | |
6 | 12-Sep-2022 | Principal Component Analysis | project ideas |
7 | 14-Sep-2022 | Curse of Dimensionality | hw1 due |
8 | 19-Sep-2022 | Bayesian Decision Theory | hw2 release |
9 | 21-Sep-2022 | Parameter estimation: MLE | |
10 | 26-Sep-2022 | Parameter estimation: MAP & NB | finalize teams |
11 | 28-Sep-2022 | Logistic Regression | |
12 | 03-Oct-2022 | Kernel Density Estimation | |
13 | 05-Oct-2022 | Support Vector Machines | hw3, hw2 due |
 | 10-Oct-2022 | Mid-point projects checkpoint | |
 | 12-Oct-2022 | Midterm: Semester Midpoint | exam |
14 | 17-Oct-2022 | Matrix Factorization | |
15 | 19-Oct-2022 | Stochastic Gradient Descent | |
16 | 24-Oct-2022 | k-means clustering | |
17 | 26-Oct-2022 | Expectation Maximization | hw4, hw3 due |
18 | 31-Oct-2022 | Automatic Differentiation | |
19 | 02-Nov-2022 | Nonlinear embedding approaches | |
20 | 07-Nov-2022 | Model comparison I | |
21 | 09-Nov-2022 | Model comparison II | hw5, hw4 due |
22 | 14-Nov-2022 | Model Calibration | |
23 | 16-Nov-2022 | Convolutional Neural Networks | |
 | 21-Nov-2022 | Fall break | |
 | 23-Nov-2022 | Fall break | |
24 | 28-Nov-2022 | Word Embedding | hw5 due |
 | 30-Nov-2022 | Presentation and exam prep day | |
 | 02-Dec-2022 | Project Final Presentations | |
 | 07-Dec-2022 | Project Final Presentations | |
 | 12-Dec-2022 | Final Exam | |
 | 15-Dec-2022 | Grades due | |
> Probability theory is nothing but common sense reduced to calculation.
>
> - Pierre-Simon Laplace, 1812
**Definition:** Bayesian decision theory quantifies the trade-offs between various classification decisions based on probabilities and the costs that accompany such decisions.

**Assumptions:**

- The priors give us the knowledge of how likely we are to get a salmon before we see any fish.
How do we make a decision between $\omega_1$ and $\omega_2$ if all we know are the priors $\prob{P}{\omega_1}$ and $\prob{P}{\omega_2}$?
decide | if |
---|---|
salmon | $\prob{P}{\omega_1} \gt \prob{P}{\omega_2}$ |
sea bass | $\prob{P}{\omega_1} \lt \prob{P}{\omega_2}$ |
\[ \prob{P}{\omega_i|\vec{x}} = \frac{\prob{P}{\vec{x}|\omega_i}\prob{P}{\omega_i}}{\prob{P}{\vec{x}}} \]
\[ \mbox{posterior} = \frac{\mbox{likelihood}\times\mbox{prior}}{\mbox{evidence}} \]
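The rule above can be sketched numerically: compute each class posterior as likelihood times prior over the evidence, then decide by the larger posterior. The prior and likelihood values below are made up for illustration.

```python
import numpy as np

# Illustrative (made-up) numbers for a two-class fish problem.
priors = np.array([2 / 3, 1 / 3])       # P(omega_1), P(omega_2)
likelihoods = np.array([0.1, 0.4])      # p(x | omega_1), p(x | omega_2) at some x

evidence = likelihoods @ priors         # p(x), the normalizer
posteriors = likelihoods * priors / evidence  # P(omega_i | x)
decision = int(np.argmax(posteriors))   # 0 -> salmon, 1 -> sea bass
print(posteriors, decision)
```

Note that the evidence only rescales both posteriors; the decision would be the same if we compared the unnormalized products $\prob{P}{\vec{x}|\omega_i}\prob{P}{\omega_i}$ directly.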
One person in 200,000 has progressive multifocal leukoencephalopathy (PML). There is a diagnostic test for the disease that is correct 99% of the time. Your test comes back positive; what is the probability that you have PML?
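Plugging the numbers into Bayes' rule shows how the tiny prior dominates. This sketch assumes "correct 99% of the time" means both sensitivity and specificity are 0.99.

```python
# Bayes' rule for the PML example.
prior = 1 / 200_000                  # P(disease)
sensitivity = 0.99                   # P(positive | disease), assumed
false_positive_rate = 1 - 0.99       # P(positive | no disease), assumed

# evidence: total probability of a positive test
evidence = sensitivity * prior + false_positive_rate * (1 - prior)
posterior = sensitivity * prior / evidence  # P(disease | positive)
print(f"{posterior:.6f}")            # ~0.0005, i.e. well under 0.1%
```

Despite the 99% accurate test, the posterior is only about 0.05%: almost all positives come from the vastly larger healthy population.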
Draw probability densities and find the decision regions for the following classes: