Ensemble Methods — Quiz

Test your understanding of ensemble methods with 5 practice questions.

Practice Questions

Question 1

Which ensemble method trains multiple models sequentially, with each subsequent model attempting to correct the errors of its predecessors?

Question 2

In a Random Forest, if we have a total of $P$ features and we randomly select $p$ features at each split ($p < P$), what is the primary benefit of this random feature selection?
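Before answering, it may help to see the mechanism concretely. The sketch below simulates drawing a fresh random subset of $p$ out of $P$ feature indices at each split, the step that decorrelates the trees in a Random Forest; the values of `P` and `p` are illustrative, not from the quiz.

```python
import random

P = 10  # total number of features (illustrative)
p = 3   # features considered at each split, p < P (illustrative)

def candidate_features(P, p, rng):
    """Draw a random subset of p feature indices for one split.

    Because each split sees a different subset, individual trees cannot
    all lean on the same dominant feature, which decorrelates the trees
    and reduces the variance of the averaged forest.
    """
    return rng.sample(range(P), p)

rng = random.Random(0)
# Simulate the feature subsets drawn at three different splits.
subsets = [candidate_features(P, p, rng) for _ in range(3)]
```

Each call returns a different subset, so two trees grown on the same data will still tend to make different splitting decisions.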

Question 3

Consider a dataset with $N$ samples. If a Bagging algorithm draws bootstrap samples of size $N$ with replacement, what is the approximate probability (for large $N$) that a specific sample from the original dataset will NOT be included in a given bootstrap sample?
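As a worked check of the arithmetic behind this question: a specific sample is missed on one draw with probability $1 - 1/N$, so it is missed on all $N$ independent draws with probability $(1 - 1/N)^N$, which approaches $e^{-1} \approx 0.368$ for large $N$. A quick numerical confirmation:

```python
import math

def prob_excluded(N):
    # Probability a given sample is never drawn in N draws with
    # replacement: (1 - 1/N)^N.
    return (1 - 1 / N) ** N

for N in (10, 100, 10_000):
    print(N, round(prob_excluded(N), 4))

print(round(math.exp(-1), 4))  # limiting value, ≈ 0.3679
```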

Question 4

What is a key advantage of using Random Forests over a single, unpruned decision tree, particularly in terms of model stability?
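The stability argument can be made tangible with a small simulation. Under the idealized assumption that each tree's error is independent noise (real trees are correlated, so the gain in practice is smaller), averaging $k$ predictions shrinks the standard deviation by a factor of $\sqrt{k}$; every name and constant below is illustrative.

```python
import random
import statistics

rng = random.Random(42)
true_value = 1.0

def noisy_estimate():
    # Stands in for one unstable, high-variance tree's prediction.
    return true_value + rng.gauss(0, 1.0)

# 2000 predictions from single "trees" vs. 2000 bagged averages of 25.
single = [noisy_estimate() for _ in range(2000)]
bagged = [statistics.mean(noisy_estimate() for _ in range(25))
          for _ in range(2000)]

print(statistics.pstdev(single))  # ≈ 1.0
print(statistics.pstdev(bagged))  # ≈ 1.0 / sqrt(25) = 0.2
```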

Question 5

In a Gradient Boosting Machine (GBM), what is the primary objective of each new tree added to the ensemble?
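The sequential error-correcting idea behind this question (and Question 1) can be sketched in a few lines. The toy implementation below, written from scratch for illustration, fits each new one-split "stump" to the current residuals, which for squared-error loss are exactly the negative gradient, then adds it to the ensemble with a small learning rate; the data and hyperparameters are made up.

```python
def fit_stump(x, residuals):
    """Fit a one-split stump: find the threshold on 1-D inputs that
    minimizes squared error of a piecewise-constant fit to residuals."""
    best = None
    for t in x:
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    _, t, lm, rm = best
    return lambda xi: lm if xi <= t else rm

def gbm_fit(x, y, n_trees=50, lr=0.1):
    pred = [0.0] * len(x)
    stumps = []
    for _ in range(n_trees):
        # For squared-error loss, the negative gradient at each point
        # is simply the residual y - prediction.
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        pred = [pi + lr * stump(xi) for pi, xi in zip(pred, x)]
    return lambda xi: sum(lr * s(xi) for s in stumps)

# Toy 1-D regression data: a step from 0 to 1.
x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
model = gbm_fit(x, y)
```

Each stump only needs to nudge the ensemble toward whatever the previous stumps got wrong, which is why the individual learners can stay weak.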