cerlymarco / shap-hypetune

A Python package for simultaneous hyperparameter tuning and feature selection for gradient boosting models.
MIT License

[Feature Request] Support simple base learners such as DecisionTreeRegressor/Classifier #39

Closed jolespin closed 5 months ago

jolespin commented 5 months ago

I've been testing this tool for the past day or so and loving the functionality. One issue I'm finding is that my dataset is quite large, and while I'm interested in the model's performance, I'm mostly interested in selecting the most robust feature set (which depends on the hyperparameter set). In my case, it would be extremely productive to be able to use simple learners such as sklearn.tree.DecisionTreeRegressor and sklearn.tree.DecisionTreeClassifier. I understand that these do not have built-in metrics, but the code could be adapted to include an RMSE loss.

Would this be a feature you are interested in developing?

Is the current infrastructure for shap-hypetune able to accommodate these types of simple learners?

cerlymarco commented 5 months ago

Hi, thanks for your feedback!

A decision tree is simply gradient boosting with a single iteration. So if you want to use decision trees with shap-hypetune, you can use LGBMRegressor(n_estimators=1, ...) or XGBRegressor(n_estimators=1, ...).

All the best