
How to Implement a Gradient Boosting Machine that Works with Any Loss Function | Random Realizations #12

utterances-bot opened 3 years ago

utterances-bot commented 3 years ago

How to Implement a Gradient Boosting Machine that Works with Any Loss Function | Random Realizations

Summarize Friedman’s seminal GBM paper and implement the generic gradient boosting algorithm to train models with any differentiable loss function.

https://blog.mattbowers.dev/gradient-boosting-machine-with-any-loss-function
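
For readers skimming the thread, here is a minimal sketch of the generic recipe the post covers: at each round, fit a regression tree to the negative gradient of the loss, then step the model in that direction. All names here are illustrative and not from the post; it uses sklearn's DecisionTreeRegressor as the base learner and a fixed learning rate in place of Friedman's per-leaf line search.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def squared_error_gradient(y, f):
    # dL/df for L = 0.5 * (y - f)^2; swap in any differentiable loss gradient
    return f - y

def fit_gbm(X, y, loss_gradient=squared_error_gradient,
            n_rounds=100, learning_rate=0.1, max_depth=3):
    # initial constant prediction (the mean is loss-optimal for squared error;
    # in general it should be the constant that minimizes the chosen loss)
    f0 = y.mean()
    f = np.full(len(y), f0)
    trees = []
    for _ in range(n_rounds):
        residual = -loss_gradient(y, f)    # pseudo-residuals
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)
        f += learning_rate * tree.predict(X)
        trees.append(tree)
    return f0, trees

def predict_gbm(f0, trees, X, learning_rate=0.1):
    f = np.full(X.shape[0], f0)
    for tree in trees:
        f += learning_rate * tree.predict(X)
    return f
```

The only loss-specific piece is `loss_gradient`, which is what makes the algorithm work with any differentiable loss.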

mcb00 commented 3 years ago

Thanks again to @tobanw for reviewing the post!

Malthehave commented 1 year ago

Great post, thanks!! How would this look if we have a multi-class classification problem?

mcb00 commented 1 year ago

@Malthehave for multi-class classification with K > 2 classes, we would fit K trees at each boosting round, where each tree nudges the predicted probability for its corresponding class in the right direction. Check out this post for a bit more detail, or see Algorithm 6 in the original paper for the original algorithm. Maybe I'll do a scratch build in a future post.
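
In the meantime, here's a rough sketch of the idea (illustrative code, not from the post: sklearn trees as base learners, one-hot targets, and a fixed learning rate; it skips the per-leaf value updates from Algorithm 6):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def softmax(F):
    # row-wise softmax over the K per-class raw scores
    e = np.exp(F - F.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def fit_multiclass_gbm(X, y, n_classes, n_rounds=100,
                       learning_rate=0.1, max_depth=3):
    # y: integer class labels in {0, ..., n_classes - 1}
    Y = np.eye(n_classes)[y]              # one-hot targets
    F = np.zeros((len(y), n_classes))     # per-class raw scores
    trees = []
    for _ in range(n_rounds):
        P = softmax(F)                    # current class probabilities
        round_trees = []
        for k in range(n_classes):
            # negative gradient of multinomial deviance for class k
            residual = Y[:, k] - P[:, k]
            tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)
            F[:, k] += learning_rate * tree.predict(X)
            round_trees.append(tree)
        trees.append(round_trees)
    return trees

def predict_multiclass(trees, X, n_classes, learning_rate=0.1):
    F = np.zeros((X.shape[0], n_classes))
    for round_trees in trees:
        for k, tree in enumerate(round_trees):
            F[:, k] += learning_rate * tree.predict(X)
    return softmax(F).argmax(axis=1)
```

The key point is the inner loop over classes: each round produces K trees, each one fit to the pseudo-residuals y_k - p_k of the multinomial deviance for its class.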