Closed: mazumdarparijat closed this issue 4 years ago
@iglesias @karlnapf @vigsterkr ?
Sounds good but totally out of my expertise....
I like this idea, and it seems to fit well with the fundamental ML project, since it is simply more that can be done with decision trees. I suggest focusing on gradient boosting first.
It is also totally out of my expertise but I think we should still be able to manage.
Alright then, let's go ahead with this. I also feel it's not too difficult to implement. But since we are not experts in this, I think it would be a good idea to compare results against R/scikit-learn on benchmark datasets?
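For reference, a minimal sketch of what such a benchmarking harness could look like. Everything here is illustrative: scikit-learn's `GradientBoostingClassifier` stands in for both sides of the comparison, since our implementation does not exist yet; only the train/evaluate skeleton is the point.

```python
# Hypothetical benchmarking sketch: fit a reference gradient-boosted
# ensemble and record its held-out accuracy, which our implementation
# would then be compared against on the same split.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# synthetic stand-in for a benchmark dataset
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1,
                                 random_state=0)
clf.fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"reference accuracy: {acc:.3f}")
```

The same split and metric would be reused for our implementation, so any gap is attributable to the algorithm rather than the data.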
Indeed, that is always a good idea (a requirement I'd say). No matter if we are experts or not :)
@mazumdarparijat what's the status of this issue?
@vigsterkr Stochastic gradient boosting (CStochasticGBMachine) is already in. AdaBoost can also be accessed through the same class just by supplying an exponential loss function, but I think it would be good to have a derived class for AdaBoost. That is not done yet. Benchmarking is also not done yet.
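To illustrate the connection being relied on here: discrete AdaBoost is what a generic boosting machine reduces to under the exponential loss. A minimal standalone sketch (not the Shogun API; scikit-learn stumps are used only as the weak learner):

```python
# Minimal AdaBoost sketch: boosting decision stumps, with the sample
# reweighting written explicitly as the exponential-loss update.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=20):
    """y must be in {-1, +1}. Returns (stumps, alphas)."""
    n = len(y)
    w = np.full(n, 1.0 / n)                     # sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.clip(np.sum(w * (pred != y)) / np.sum(w), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # stage weight
        w *= np.exp(-alpha * y * pred)          # exponential-loss reweighting
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    score = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(score)

# demo on a toy problem with a diagonal decision boundary
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
stumps, alphas = adaboost_fit(X, y)
train_acc = float(np.mean(adaboost_predict(stumps, alphas, X) == y))
```

A derived class would essentially bake in this loss and the `{-1, +1}` label convention, which is why it seems worth having even though the generic machine already covers it.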
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
This issue is now being closed due to a lack of activity. Feel free to reopen it.
The idea is to incorporate gradient boosting and AdaBoost. These are widely used with decision trees to improve results. Since we have already planned to implement Random Forests, I feel implementing boosted trees is the natural next step. Let me know what you think.
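For concreteness, the core loop being proposed can be sketched in a few lines. This is an illustrative toy under squared loss, not any planned API; scikit-learn's `DecisionTreeRegressor` stands in for our tree learner:

```python
# Hypothetical gradient-boosting sketch: each round fits a small
# regression tree to the current residuals (the negative gradient of
# squared loss) and adds a shrunken copy to the ensemble.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gb_fit(X, y, n_rounds=50, lr=0.1):
    f0 = float(y.mean())                  # initial constant model
    pred = np.full(len(y), f0)
    trees = []
    for _ in range(n_rounds):
        resid = y - pred                  # negative gradient of squared loss
        t = DecisionTreeRegressor(max_depth=2)
        t.fit(X, resid)
        pred += lr * t.predict(X)
        trees.append(t)
    return f0, trees

def gb_predict(f0, trees, X, lr=0.1):
    return f0 + lr * sum(t.predict(X) for t in trees)

# demo: fit a smooth curve with an additive ensemble of shallow trees
X = np.linspace(-1, 1, 100).reshape(-1, 1)
y = X.ravel() ** 2
f0, trees = gb_fit(X, y)
mse = float(np.mean((gb_predict(f0, trees, X) - y) ** 2))
```

Swapping the residual computation for the gradient of a different loss is what generalizes this to classification and to AdaBoost-style exponential loss.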