| Week | Date | Content | Exercise |
|---|---|---|---|
| 1 | 9/2 | 1. Overview of the course | |
| 2 | 9/9 | 1. Introduction to ML 2. KNN 3. k-means 4. Distance measures | Project proposal (group) (due: 9/15 23:59:59) |
| 3 | 9/16 | 1. Entropy 2. Decision tree | |
| 4 | 9/23 | 1. Matrix derivatives 2. Linear regression and regularization (Lasso, Ridge, Elastic-net) | |
| 5 | 9/30 | Logistic regression and gradient ascent | |
| 6 | 10/7 | 1. Evaluation metrics for binary classification, multi-class classification, and multi-label classification 2. ROC curve vs PR curve | |
| 7 | 10/14 | 1. Entropy, cross-entropy, and KL-divergence 2. Practical concerns on traditional machine learning 3. Ensemble learning | |
| 8 | 10/21 | 1. Gradient boosting machines 2. Linear SVM | 1. Progress report (due: 10/20 23:59:59) 2. Kaggle Competition begins (due: 11/10 23:59:59) |
| 9 | 10/28 | 1. Multi-layer perceptron 2. Convolutional neural network | |
| 10 | 11/4 | 1. Convolutional neural network 2. Recurrent neural network | |
| 11 | 11/11 | No physical class; the lecture is given by recorded video. 1. Kernel SVM 2. Regularized linear regression and classification 3. Linear SVM with poly-2 terms vs. polynomial kernel SVM | |
| 12 | 11/18 | 1. Word2Vec 2. Transformer 3. Large language models | |
| 13 | 11/25 | Associated learning | Mini-ML implementation and report (due: 11/24 23:59:59) |
| 14 | 12/2 | Attend AI Sustainability Forum (AI永續論壇) | |
| 15 | 12/9 | Contrastive learning, supervised contrastive learning, and SCPL | |
| 16 | 12/16 | Diffusion models | Final project (due: 12/15 23:59:59) |