adendek closed this issue 6 years ago
Gradient boosted trees are essentially ensembles of decision trees, each of which can be converted into nested if-else statements.
Once you train your model, you can dump the tree model and convert it into custom C++ code: if-else statements that check the feature and split values, returning the leaf value when you reach a leaf.
@adendek - just create a script that parses the tree dump and generates one giant function of if-else statements. That's what I did, and it works.
@drag0 Can you share the script with me? PS. Did you check the time performance of your solution? In my project timing is a crucial issue.
@drag0 what is the status of this script?
Can you send this script to me? Thank you. My email is byronliwei@gmail.com @drag0
@drag0 I can use that script too! Thanks ( yassine dot landa at gmail dot com)
@drag0 I would highly appreciate receiving your script to gan.sagur@gmail.com. Thanks!!
Can you please share the script to me also :+1: carsonwolf@gmail.com
please share the script to me also kaishengy@gmail.com thanks a lot!
I'll push the script to GitHub today or tomorrow evening so you'll be able to use it. Sorry everyone for the long wait.
Here it is guys: https://github.com/drag0/xgb2cpp
@drag0 you rock! Thanks.
Is there any other way to use an xgboost classifier in a C++ project? The generated C++ model takes too long to compile on the TX2. Thanks
Let's assume I created and trained a DNN model (via the Python API) and then want to deploy it in a standalone C++ project. How can I do that? Can you give me any advice?