5. Regression and Modeling

Variable Selection — Quiz

Test your understanding of variable selection with 5 practice questions.

Practice Questions

Question 1

In the context of regularization, consider the objective function for a linear regression model: $ \min_{\beta} \left\{ \sum_{i=1}^{n} (y_i - \mathbf{x}_i^T \beta)^2 + \lambda P(\beta) \right\} $. For Lasso Regression, what is the specific form of the penalty function $P(\beta)$?
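To make the objective concrete, here is a small numpy sketch of the penalized least-squares criterion with interchangeable penalty functions. The data, the `objective` helper, and the `l1`/`l2` names are all illustrative assumptions, not part of the quiz.

```python
import numpy as np

# Hypothetical data, purely for illustration
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
beta = np.array([1.5, 0.0, -2.0])
y = X @ beta + rng.normal(scale=0.1, size=20)

def objective(b, lam, penalty):
    """Penalized least squares: sum of squared residuals + lambda * P(b)."""
    rss = np.sum((y - X @ b) ** 2)
    return rss + lam * penalty(b)

def l1(b):
    """Sum of absolute values of the coefficients."""
    return np.sum(np.abs(b))

def l2(b):
    """Sum of squared coefficients."""
    return np.sum(b ** 2)

print(l1(beta))  # 3.5  (|1.5| + |0| + |-2|)
print(l2(beta))  # 6.25 (1.5^2 + 0 + 4)
```

Plugging either penalty into `objective` recovers the two classic shrinkage criteria; with `lam = 0` the objective reduces to ordinary least squares.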

Question 2

When comparing models using the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC), under what conditions might BIC be preferred over AIC for model selection?
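Both criteria trade goodness of fit against model size, differing only in how hard they penalize each parameter. A minimal sketch of the standard formulas (the function names are my own):

```python
import numpy as np

def aic(log_lik, k):
    """AIC = 2k - 2 ln(L): constant penalty of 2 per parameter."""
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    """BIC = k ln(n) - 2 ln(L): penalty grows with sample size n."""
    return k * np.log(n) - 2 * log_lik

# BIC's per-parameter penalty ln(n) exceeds AIC's 2 once n > e^2 ≈ 7.4,
# so for all but tiny samples BIC favors smaller models than AIC.
for n in (5, 100):
    print(n, aic(0.0, 3), bic(0.0, 3, n))
```

Holding the log-likelihood fixed, the comparison shows why BIC tends to select more parsimonious models as `n` grows.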

Question 3

Consider a dataset with 100 observations and 50 predictors, where many predictors are highly correlated. You want to build a parsimonious model that performs automatic feature selection and handles multicollinearity effectively. Which regularization method would be most appropriate, and why?
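For intuition on how a combined L1/L2 penalty behaves with correlated predictors, here is a naive coordinate-descent sketch for the elastic net on standardized columns. Everything here (the solver, the simulated near-duplicate predictors, the `lam`/`alpha` parameterization) is an illustrative assumption, not a prescribed implementation.

```python
import numpy as np

def elastic_net_cd(X, y, lam, alpha, n_iter=200):
    """Naive coordinate descent for the elastic net.
    Assumes columns of X are standardized and y is centered.
    alpha=1 gives the lasso, alpha=0 gives ridge."""
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding predictor j
            r = y - X @ b + X[:, j] * b[j]
            z = X[:, j] @ r / n
            # Soft-threshold (L1 part), then proportional shrink (L2 part)
            b[j] = np.sign(z) * max(abs(z) - lam * alpha, 0.0) / (1 + lam * (1 - alpha))
    return b

# Two nearly identical predictors plus one independent one
rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
X = np.column_stack([x1, x1 + 0.01 * rng.normal(size=200), rng.normal(size=200)])
X = (X - X.mean(0)) / X.std(0)
y = X[:, 0] + 0.5 * X[:, 2] + 0.1 * rng.normal(size=200)
y = y - y.mean()

b = elastic_net_cd(X, y, lam=0.1, alpha=0.5)
print(b)
```

The L2 component tends to spread weight across the correlated pair rather than arbitrarily zeroing one of them, while the L1 component still allows exact zeros elsewhere.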

Question 4

In the context of stepwise variable selection, if you are using backward elimination, what is the primary criterion typically used to decide which variable to remove at each step?
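The elimination loop itself is mechanical once a removal criterion is fixed. Below is a sketch that iteratively drops the predictor with the weakest OLS t-test until every remaining one clears a significance cutoff; the function name, cutoff, and simulated data are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def backward_eliminate(X, y, names, alpha=0.05):
    """Repeatedly drop the predictor with the largest t-test p-value
    until all remaining p-values fall below alpha. Illustrative only."""
    X = X.copy()
    names = list(names)
    n = len(y)
    while X.shape[1] > 1:
        A = np.column_stack([np.ones(n), X])          # add intercept
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        dof = n - A.shape[1]
        sigma2 = resid @ resid / dof
        se = np.sqrt(sigma2 * np.diag(np.linalg.inv(A.T @ A)))
        t = beta / se
        pvals = 2 * stats.t.sf(np.abs(t[1:]), dof)    # skip the intercept
        worst = int(np.argmax(pvals))
        if pvals[worst] < alpha:
            break                                     # everything is significant
        X = np.delete(X, worst, axis=1)
        names.pop(worst)
    return names

rng = np.random.default_rng(2)
n = 200
X = rng.normal(size=(n, 3))
y = 2.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(size=n)  # third column is pure noise
print(backward_eliminate(X, y, ["x1", "x2", "noise"]))
```

With strong true signals, `x1` and `x2` survive regardless of whether the noise column happens to clear the cutoff in a given sample.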

Question 5

Consider a linear regression model with a large number of predictors. If you observe that the model is performing poorly due to high variance (overfitting), and you want to introduce a penalty that shrinks coefficients but does not necessarily set them to zero, which regularization method would be most suitable?
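A closed-form sketch of coefficient shrinkage under a squared-L2 penalty, using simulated data (the `ridge` helper and the chosen `lam` values are assumptions for illustration):

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form minimizer of RSS + lam * ||b||_2^2:
    solves (X^T X + lam I) b = X^T y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

rng = np.random.default_rng(3)
X = rng.normal(size=(50, 4))
y = X @ np.array([3.0, -2.0, 0.0, 1.0]) + rng.normal(size=50)

for lam in (0.0, 10.0, 1000.0):
    b = ridge(X, y, lam)
    # As lam grows, coefficients shrink toward zero but (generically)
    # never become exactly zero, unlike the L1 penalty.
    print(lam, b)
```

At `lam = 0` this reproduces the ordinary least-squares solution, and the coefficient norm decreases monotonically as the penalty grows.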