Week | Date | Content | Exercise |
---|---|---|---|
1 | 9/11 | Overview of the course | |
2 | 9/18 | 1. Statistical learning 2. Important probability and statistics concepts | |
3 | 9/25 | Bayesian network and d-separation | Exercise 1 (due: 10/15 23:59:59) |
4 | 10/2 | 1. Naive Bayes 2. Bayesian network modeling examples | |
5 | 10/9 | Holiday | |
6 | 10/16 | 1. Markov Random Fields 2. Point estimation: MLE and MAP (for common probability distributions) | Exercise 2 (due: 10/29 23:59:59) |
7 | 10/23 | Exponential family and conjugate prior | |
8 | 10/30 | 1. Hypothesis testing 2. Midterm review | |
9 | 11/6 | Midterm exam | |
10 | 11/13 | EM algorithm | |
11 | 11/20 | 1. Invited talk: "Intro to explainable AI" by Prof. Hao-Tsung Yang 2. Monte Carlo methods | Exercise 3 (due: 12/3 23:59:59) |
12 | 11/27 | Variational inference | |
13 | 12/4 | 1. Monte Carlo methods 2. Inverse transform sampling, rejection sampling, and importance sampling 3. Markov chains | |
14 | 12/11 | 1. Markov Chain Monte Carlo: Metropolis-Hastings and Gibbs sampling 2. Latent Dirichlet Allocation with MCMC 3. Bayesian linear regression: closed-form vs. MCMC vs. VI | Exercise 4 (due: 12/24 23:59:59) |
15 | 12/18 | 1. Gaussian processes 2. Summary of the course | |
16 | 12/25 | Final exam | |
17 | 1/1 | Holiday | |
18 | 1/8 | Flexible learning week: Bayesian neural network | |