dmitryikh / leaves

Pure Go implementation of the prediction part for GBRT (Gradient Boosting Regression Trees) models from popular frameworks

xgboost consistency failed #55

Closed vanillar7 closed 5 years ago

vanillar7 commented 5 years ago

I built an xgb model in Python and then ran it on a test dataset. But when I use leaves to load the model and predict, the results are inconsistent with the Python results.

I also tested an lgb model with the same dataset, and those results are consistent.
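
For context, the Go side looks roughly like this. This is a minimal sketch assuming the current leaves API (`XGEnsembleFromFile` / `PredictSingle`); the model path and feature values are placeholders, not my exact code:

```go
package main

import (
	"fmt"

	"github.com/dmitryikh/leaves"
)

func main() {
	// Load an xgboost model previously saved from Python ("xgb.model" is a placeholder path).
	model, err := leaves.XGEnsembleFromFile("xgb.model", false)
	if err != nil {
		panic(err)
	}

	// Predict a single row; the feature values here are placeholders.
	fvals := []float64{1.0, 2.0, 3.0}
	p := model.PredictSingle(fvals, 0)
	fmt.Printf("prediction: %f\n", p)
}
```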

insikk commented 5 years ago

Can you post example code so we can reproduce the inconsistent result?

dmitryikh commented 5 years ago

@vanillar7 , thanks for your report. Could you please provide more details on your case? Ideally, I would like to reproduce your steps on my side to be able to track the bug.

By the way, as noted in README.md: "there could be a slight divergence between C API predictions and leaves because of floating point conversions and comparison tolerances." That means that if a feature value falls very close to a tree node's threshold, the float32 (xgboost format) -> float64 (leaves format) conversion can send it down a different decision path in the tree.
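
To illustrate with a made-up threshold (not taken from your model): a value that compares as equal to the split threshold in float32 can land on the other side of it once both sides are handled as float64:

```go
package main

import "fmt"

func main() {
	// xgboost stores split thresholds as float32; leaves widens them to float64.
	threshold32 := float32(0.1)         // threshold as stored in the xgboost model
	threshold64 := float64(threshold32) // the same threshold after widening to float64

	feature := 0.1 // feature value as Go's float64

	// xgboost compares in float32, leaves compares in float64:
	fmt.Println(float32(feature) < threshold32) // false (equal in float32)
	fmt.Println(feature < threshold64)          // true -> a different branch is taken
}
```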

vanillar7 commented 5 years ago

@dmitryikh Thanks for the reminder. I ran a test on the Higgs dataset, and as noted in the README, there's only a marginal difference between the two estimates.
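
In case it's useful to anyone else, this is roughly how I compared the two outputs; a minimal sketch where the prediction slices and tolerance are placeholders:

```go
package main

import (
	"fmt"
	"math"
)

// maxAbsDiff returns the largest absolute difference between two prediction slices
// (assumed to have the same length).
func maxAbsDiff(a, b []float64) float64 {
	max := 0.0
	for i := range a {
		if d := math.Abs(a[i] - b[i]); d > max {
			max = d
		}
	}
	return max
}

func main() {
	// Placeholder predictions standing in for the xgboost (Python) and leaves (Go) outputs.
	xgbPreds := []float64{0.123456, 0.654321}
	leavesPreds := []float64{0.123457, 0.654320}

	const tol = 1e-5 // placeholder tolerance
	diff := maxAbsDiff(xgbPreds, leavesPreds)
	fmt.Printf("max abs diff: %g (within tolerance: %v)\n", diff, diff <= tol)
}
```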