| Category | Difficulty |
|---|---|
| HW | 4 |
| Exams | 3 |
The course is generally taught by Professor Yuejie Chi together with Professor Gauri Joshi or Professor Carlee Joe-Wong. It is a traditional machine learning course that aims to give you a good theoretical understanding of ML algorithms. The course sits right between 10-601 and 10-701: it is more theoretically oriented and mathematically intensive than 10-601, and less intensive than 10-701. It will give you a good foundation for advanced ML courses in future semesters. Having basic knowledge of probability and statistics will make your life easier :)
Some of the main topics covered here are:
- MLE/MAP
- Linear Regression
- Overfitting, Bias/variance tradeoff, Evaluation
- Naive Bayes / Logistic Regression
- SVM
- Nearest Neighbors
- Decision Trees
- Boosting, Random Forest
- Neural Networks
- Clustering
- EM
- Dimensionality Reduction
- Introduction to RL
- Guest lectures
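As a taste of the very first topic on the list, MLE has a clean closed form for Gaussian data: the maximum-likelihood estimates of the mean and variance are the sample mean and the biased (1/n) sample variance. A minimal sketch (the data here is synthetic and purely illustrative, not from the course):

```python
import numpy as np

# Illustrative only: i.i.d. Gaussian samples with known true parameters,
# so we can see the MLE recover them.
rng = np.random.default_rng(42)
data = rng.normal(loc=3.0, scale=2.0, size=10_000)

mu_mle = data.mean()                      # maximizes the log-likelihood in mu
var_mle = ((data - mu_mle) ** 2).mean()   # note the 1/n, not 1/(n-1)

print(mu_mle, var_mle)  # close to the true values 3.0 and 4.0
```

The 1/n vs. 1/(n-1) distinction is exactly the kind of detail the homework likes to probe.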
There are 7 HWs, a midterm, and a final. HW 7 is generally a bonus, and you are allowed to drop the score of one HW.
Attend classes regularly. Assignments are conceptual and have spots where you might get stuck, so plan for TA hours accordingly. The HWs are meant to be hard, and if you make sure you understand each problem, that should be sufficient to score well on the midterm and final. The exams are much easier than the HWs, so don't worry if you feel you are not performing well at the start. The professors are super helpful, so make sure you get your doubts clarified.
HW 5 is coding-intensive: you have to write a neural network model from scratch in Python, so start early if you are not familiar with coding in Python or with object-oriented programming in general.
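To get a feel for what "from scratch" means here, the sketch below trains a one-hidden-layer network with hand-written backpropagation on toy XOR data. Everything in it (layer sizes, learning rate, the data) is an illustrative assumption, not the actual HW 5 spec:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR data: 4 points, binary targets (purely illustrative).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Parameters: input(2) -> hidden(8) -> output(1)
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
losses = []
for _ in range(2000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)       # hidden activations
    p = sigmoid(h @ W2 + b2)       # output probabilities
    loss = np.mean((p - y) ** 2)   # mean squared error
    losses.append(loss)

    # Backward pass: chain rule written out by hand
    dp = 2 * (p - y) / len(X)
    dz2 = dp * p * (1 - p)         # sigmoid derivative at the output
    dW2 = h.T @ dz2
    db2 = dz2.sum(axis=0)
    dh = dz2 @ W2.T
    dz1 = dh * h * (1 - h)         # sigmoid derivative at the hidden layer
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0)

    # Plain gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The real assignment is larger (and class-based), but if deriving the backward pass above feels unfamiliar, that is a sign to start the HW early.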
Tip: HW 6 is supposed to be the conceptually and theoretically heaviest, so keep some late/grace days for it (a former course TA's advice :P)
You can check out the previous year's material here: https://www.andrew.cmu.edu/course/18-661/