Week 8 Info: Trees and Ensembles
In week 8 we begin our study of classification and regression trees, which are among the most common "complex" machine learning models. We will start by learning how tree methods work and how to regularize them. This leads naturally to ensemble methods, which average together a large number of fitted models to make a consensus or mean prediction, thereby reducing variance. This week we will cover Bagging (Bootstrap Aggregation) and Random Forests, saving boosting and BART for future weeks.
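To give a flavor of what's coming, here is a minimal sketch of bagging for regression, using a one-split "stump" as the base learner. All function names and the toy data below are illustrative, not taken from the course materials; the point is just that averaging many models fit on bootstrap resamples smooths out the prediction.

```python
# Illustrative sketch of bagging (bootstrap aggregation), not course code.
import random

def fit_stump(xs, ys):
    """Fit a depth-1 regression tree: choose the split on x that minimizes
    squared error, then predict the mean label on each side."""
    best = None
    for s in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= s]
        right = [y for x, y in zip(xs, ys) if x > s]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        err = sum((y - ml) ** 2 for y in left) + sum((y - mr) ** 2 for y in right)
        if best is None or err < best[0]:
            best = (err, s, ml, mr)
    if best is None:  # degenerate resample: all x identical
        m = sum(ys) / len(ys)
        return lambda x: m
    _, s, ml, mr = best
    return lambda x: ml if x <= s else mr

def bagged_predictor(xs, ys, n_trees=25, seed=0):
    """Fit n_trees stumps on bootstrap resamples of the data;
    the bagged prediction is the mean of the stumps' predictions."""
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(xs)) for _ in range(len(xs))]
        stumps.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    return lambda x: sum(f(x) for f in stumps) / len(stumps)

# Toy step-function data: y = 0 for x <= 4, y = 1 for x >= 5.
xs = list(range(10))
ys = [0.0] * 5 + [1.0] * 5
predict = bagged_predictor(xs, ys)
```

A Random Forest works the same way, except each tree also considers only a random subset of the features at each split, which further decorrelates the trees.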
You can read more about the plan for the week and the reading/resources in Module 8.
I have decided to move the due date for the Minimum Viable Product demo to after spring break. This will be reflected in Brightspace soon.
Lab 5 is available and due in two weeks.