utterances-bot opened 3 years ago
Thanks again to @tobanw for reviewing the post!
Great post, thanks!! How would this look if we had a multi-class classification problem?
@Malthehave for multi-class classification with K > 2 classes, we would fit K trees at each boosting round, where each tree tries to nudge the predicted probabilities in the right direction for its corresponding class. This post has a bit more detail, or see Algorithm 6 in the original paper for the original algorithm. Maybe I'll do a scratch build in a future post.
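The K-trees-per-round idea can be sketched in a few lines of Python. This is a minimal illustration, not the post's implementation: I'm assuming scikit-learn's `DecisionTreeRegressor` as the base learner, and the class name `MultiClassGBM` and its parameters are made up for this example. Each round fits one regression tree per class on the negative gradient of the multinomial deviance, which works out to one-hot labels minus softmax probabilities.

```python
# Hypothetical sketch of multiclass gradient boosting in the spirit of
# Algorithm 6 from Friedman's GBM paper: K trees per boosting round,
# each fit to the negative gradient for its class.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.datasets import load_iris

def softmax(F):
    # numerically stable row-wise softmax over raw scores F, shape (n, K)
    e = np.exp(F - F.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

class MultiClassGBM:
    def __init__(self, n_rounds=30, learning_rate=0.3, max_depth=2):
        self.n_rounds = n_rounds
        self.learning_rate = learning_rate
        self.max_depth = max_depth
        self.trees = []  # one list of K trees per boosting round

    def fit(self, X, y):
        self.K = len(np.unique(y))
        Y = np.eye(self.K)[y]               # one-hot labels, shape (n, K)
        F = np.zeros((X.shape[0], self.K))  # raw (pre-softmax) scores
        for _ in range(self.n_rounds):
            P = softmax(F)
            round_trees = []
            for k in range(self.K):
                # negative gradient of multinomial deviance for class k
                residual = Y[:, k] - P[:, k]
                tree = DecisionTreeRegressor(max_depth=self.max_depth)
                tree.fit(X, residual)
                F[:, k] += self.learning_rate * tree.predict(X)
                round_trees.append(tree)
            self.trees.append(round_trees)
        return self

    def predict_proba(self, X):
        F = np.zeros((X.shape[0], self.K))
        for round_trees in self.trees:
            for k, tree in enumerate(round_trees):
                F[:, k] += self.learning_rate * tree.predict(X)
        return softmax(F)

    def predict(self, X):
        return self.predict_proba(X).argmax(axis=1)

# quick sanity check on a 3-class dataset
X, y = load_iris(return_X_y=True)
model = MultiClassGBM().fit(X, y)
acc = (model.predict(X) == y).mean()
```

Note the key design point: all K trees in a round are fit against gradients computed from the *same* current probabilities, then the scores are updated, so the classes are coupled only through the softmax.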
How to Implement a Gradient Boosting Machine that Works with Any Loss Function | Random Realizations
Summarize Friedman’s seminal GBM paper and implement the generic gradient boosting algorithm to train models with any differentiable loss function.
https://blog.mattbowers.dev/gradient-boosting-machine-with-any-loss-function