Closed jolespin closed 5 months ago
Hi, thanks for your feedback!
A Decision Tree is just Gradient Boosting run for a single iteration (with a learning rate of 1, so the leaf values aren't shrunk). So if you want to use Decision Trees with shap-hypetune, you can use `LGBMRegressor(n_estimators=1, ...)` or `XGBRegressor(n_estimators=1, ...)`.
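The equivalence can be checked with plain scikit-learn (the dataset, depth, and other settings below are just illustrative): one boosting iteration fits a tree on the residuals from the mean, so its predictions coincide with a plain decision tree grown with the same depth and split criterion.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=200, n_features=10, random_state=0)

# Gradient boosting stopped after a single iteration, with no shrinkage.
gbm = GradientBoostingRegressor(n_estimators=1, learning_rate=1.0,
                                max_depth=3, random_state=0)
gbm.fit(X, y)

# A plain decision tree with the same depth and split criterion
# (gradient boosting uses "friedman_mse" internally by default).
tree = DecisionTreeRegressor(max_depth=3, criterion="friedman_mse",
                             random_state=0)
tree.fit(X, y)

# The two models make numerically identical predictions.
print(np.allclose(gbm.predict(X), tree.predict(X)))
```

The same reasoning is why `LGBMRegressor(n_estimators=1)` or `XGBRegressor(n_estimators=1)` works as a stand-in for a single tree inside shap-hypetune.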
All the best
I've been testing this tool for the past day or so and loving the functionality. One issue I'm finding is that my dataset is quite large, and while I'm interested in the performance of the model, I'm mostly interested in selecting the most robust feature set (which depends on the hyperparameter set). In my case, it would be extremely productive to be able to use simple learners such as `sklearn.tree.DecisionTreeRegressor` and `sklearn.tree.DecisionTreeClassifier`. I understand that these don't have built-in metrics, but the code could be adapted to include RMSE loss. Would this be a feature you are interested in developing?
Is the current infrastructure for shap-hypetune able to accommodate these types of simple learners?