rapidsai / cuml

cuML - RAPIDS Machine Learning Library
https://docs.rapids.ai/api/cuml/stable/
Apache License 2.0

[FEA] Extra-trees training in cuML #3063

Open tzemicheal opened 3 years ago

tzemicheal commented 3 years ago

Is your feature request related to a problem? Please describe.
scikit-learn provides training and inference for Extra-Trees regression/classification: https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.ExtraTreesRegressor.html

from sklearn.datasets import load_diabetes
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import train_test_split
# Fit an Extra-Trees regressor on the diabetes dataset and score it on a held-out split
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
reg = ExtraTreesRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
reg.score(X_test, y_test)

Describe the solution you'd like
It would be great to have ExtraTreesRegressor/ExtraTreesClassifier in cuML for use in various ensemble learning tasks.
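
As a rough sketch of the desired API, a hypothetical cuml.ensemble.ExtraTreesRegressor mirroring the scikit-learn signature might be used like this (the class name and constructor arguments below are assumed, not an existing cuML API):

# Hypothetical sketch only: cuml.ensemble.ExtraTreesRegressor does not exist yet;
# the class name and arguments are assumed to mirror scikit-learn's estimator API.
import cupy as cp
from cuml.ensemble import ExtraTreesRegressor  # hypothetical import

# Synthetic single-precision data generated on the GPU
X = cp.random.rand(10000, 20, dtype=cp.float32)
y = 2.0 * X[:, 0] + cp.random.standard_normal(10000, dtype=cp.float32)

reg = ExtraTreesRegressor(n_estimators=100, random_state=0).fit(X, y)
print(reg.score(X, y))

Presumably such an estimator would live alongside cuML's existing RandomForestRegressor/RandomForestClassifier in cuml.ensemble.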

teju85 commented 3 years ago

Just collecting more info here...

@tzemicheal any thoughts on whether we should also support the bootstrap and max_samples hyperparameters, as sklearn does? I ask because the original paper on ET explicitly says the following: "... and that it uses the whole learning sample (rather than a bootstrap replica) to grow the trees." (See the sketch below for how those two knobs behave in sklearn today.)
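
For reference, a minimal sketch of how those two options behave in scikit-learn today (stock sklearn usage, not cuML): bootstrap defaults to False, so each tree is grown on the whole learning sample as in the paper, and max_samples only takes effect when bootstrap=True. The dataset here is synthetic and only for illustration.

from sklearn.datasets import make_regression
from sklearn.ensemble import ExtraTreesRegressor

X, y = make_regression(n_samples=1000, n_features=10, random_state=0)

# Default behaviour, matching the paper: every tree sees the full learning sample
et_default = ExtraTreesRegressor(n_estimators=100, random_state=0)  # bootstrap=False

# sklearn's optional extension: bootstrap replicas, each drawing 50% of the rows
et_bootstrap = ExtraTreesRegressor(
    n_estimators=100, bootstrap=True, max_samples=0.5, random_state=0
)

et_default.fit(X, y)
et_bootstrap.fit(X, y)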

github-actions[bot] commented 3 years ago

This issue has been marked rotten due to no recent activity in the past 90d. Please close this issue if no further response or action is needed. Otherwise, please respond with a comment indicating any updates or changes to the original issue and/or confirm this issue still needs to be addressed.

github-actions[bot] commented 3 years ago

This issue has been labeled inactive-30d due to no recent activity in the past 30 days. Please close this issue if no further response or action is needed. Otherwise, please respond with a comment indicating any updates or changes to the original issue and/or confirm this issue still needs to be addressed. This issue will be labeled inactive-90d if there is no activity in the next 60 days.

Somaya-Alshare commented 3 years ago

Hi @tzemicheal, first of all, pardon my ignorance: I am totally new to GitHub. I used to download repos and work on them locally for myself, but I have never contributed or communicated with others here before. Can I work on this issue, or has the work already been done? I guess since it is an open issue, it still needs work, right?

beckernick commented 2 years ago

Hi @Somaya-Alshare , we always welcome contributions! Do you have experience working with CUDA / C++?

showkeyjar commented 1 month ago

How is it going?