Context:
We add many predictors (for example, all sji/sja profiles). Additionally, OpenSTEF generates many features. RFE should be applied to the total set of input features.
How:
Investigate how model results improve when RFE is employed.
Result:
Comparison notebook
Acceptance criteria:
[ ] A conclusion has been documented on whether recursive feature elimination benefits the forecasting performance and explainability of the models.
Why/What: Recursive Feature Elimination (RFE) is a popular feature selection algorithm. RFE is popular because it is easy to configure and use, and because it is effective at selecting the features (columns) in a training dataset that are most relevant to predicting the target. See:
https://scikit-learn.org/stable/modules/generated/sklearn.feature_selection.RFE.html
https://machinelearningmastery.com/rfe-feature-selection-in-python/
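As a starting point for the comparison notebook, a minimal sketch of RFE with scikit-learn is shown below. The feature matrix here is synthetic (`make_regression`); in the actual investigation it would be the total set of OpenSTEF input features, and the estimator and the number of features to keep are placeholder choices to be tuned.

```python
# Minimal RFE sketch with scikit-learn on synthetic data.
# The estimator and n_features_to_select are illustrative choices,
# not the settings OpenSTEF would use.
from sklearn.datasets import make_regression
from sklearn.feature_selection import RFE
from sklearn.tree import DecisionTreeRegressor

# Synthetic regression data: 100 features, only 10 informative.
X, y = make_regression(
    n_samples=500, n_features=100, n_informative=10, random_state=0
)

# Recursively eliminate features, dropping 5 per iteration,
# until 10 features remain.
rfe = RFE(
    estimator=DecisionTreeRegressor(random_state=0),
    n_features_to_select=10,
    step=5,
)
rfe.fit(X, y)

# Indices of the retained features (rfe.support_ is a boolean mask).
selected = [i for i, keep in enumerate(rfe.support_) if keep]
print(f"Selected {len(selected)} of {X.shape[1]} features")
```

In the notebook, forecasting performance and explainability could then be compared between a model trained on all features and one trained on `X[:, rfe.support_]`.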