dmitryikh / leaves

Pure Go implementation of the prediction part of GBRT (Gradient Boosting Regression Trees) models from popular frameworks
MIT License

The prediction is wrong when using XGEnsembleFromFile to load a model #86

Closed: Xelawk closed this issue 1 year ago

Xelawk commented 1 year ago

I'm using xgbModel.nativeBooster.saveModel in Spark to save the native model, then loading that model with XGEnsembleFromFile to predict on the validation dataset, but the results don't match the predictions made in Spark (a minimal sketch of my loading code is at the end of this comment). Here are the results predicted with the leaves framework:

label: 1, pred: 0.836042
label: 1, pred: 0.836042
label: 1, pred: 0.797784
label: 1, pred: 0.934794
label: 1, pred: 0.793824
label: 1, pred: 0.797579
label: 1, pred: 0.959390
label: 1, pred: 0.959390
label: 1, pred: 0.959390
label: 1, pred: 0.704733
label: 1, pred: 0.787566
label: 1, pred: 0.941911
label: 1, pred: 0.934794
label: 1, pred: 0.749724
label: 1, pred: 0.929430
label: 1, pred: 0.931993
label: 1, pred: 0.797579
label: 1, pred: 0.839686
label: 1, pred: 0.759537
label: 1, pred: 0.813373
label: 1, pred: 0.760041
label: 1, pred: 0.793824
label: 1, pred: 0.934794
label: 1, pred: 0.759537
label: 1, pred: 0.929430
label: 1, pred: 0.945538
label: 1, pred: 0.785153
label: 1, pred: 0.959390
label: 1, pred: 0.793824
label: 1, pred: 0.779831
label: 1, pred: 0.959390
label: 1, pred: 0.749724
label: 1, pred: 0.941911
label: 1, pred: 0.798052
label: 1, pred: 0.749724
label: 1, pred: 0.931993
label: 1, pred: 0.749724
label: 1, pred: 0.929430
label: 1, pred: 0.839686
label: 1, pred: 0.839686
label: 1, pred: 0.806166
label: 1, pred: 0.934794
label: 1, pred: 0.839686
label: 1, pred: 0.785153
label: 1, pred: 0.806166
label: 1, pred: 0.945538
label: 1, pred: 0.803833
label: 1, pred: 0.759537
label: 1, pred: 0.806166
label: 1, pred: 0.768660
label: 1, pred: 0.797784
label: 1, pred: 0.931993
label: 1, pred: 0.749724
label: 1, pred: 0.824530
label: 1, pred: 0.959390
label: 1, pred: 0.959390
label: 1, pred: 0.806893
label: 1, pred: 0.929430
label: 1, pred: 0.803833
label: 1, pred: 0.797148
label: 1, pred: 0.931993
label: 1, pred: 0.797579
label: 1, pred: 0.787042
label: 1, pred: 0.803833
label: 1, pred: 0.959390
label: 1, pred: 0.931993
label: 1, pred: 0.806166
label: 1, pred: 0.836042
label: 1, pred: 0.934794
label: 1, pred: 0.934794
label: 1, pred: 0.803833
label: 1, pred: 0.749724
label: 1, pred: 0.931993
label: 1, pred: 0.759537
label: 1, pred: 0.779831
label: 1, pred: 0.787042
label: 1, pred: 0.785153
label: 1, pred: 0.749724
label: 1, pred: 0.749724
label: 1, pred: 0.934794
label: 1, pred: 0.929430
label: 1, pred: 0.797579
label: 1, pred: 0.945538
label: 1, pred: 0.934794
label: 1, pred: 0.959390
label: 1, pred: 0.959390
label: 1, pred: 0.787042
label: 1, pred: 0.787042
label: 1, pred: 0.931993
label: 1, pred: 0.759537
label: 1, pred: 0.941911
label: 1, pred: 0.749724
label: 1, pred: 0.850764
label: 1, pred: 0.945538
label: 1, pred: 0.803833
label: 1, pred: 0.749724
label: 1, pred: 0.797579
label: 1, pred: 0.785153
label: 1, pred: 0.941911
label: 1, pred: 0.806166
label: 0, pred: 0.767173
label: 0, pred: 0.807510
label: 0, pred: 0.797784
label: 0, pred: 0.824530
label: 0, pred: 0.839686
label: 0, pred: 0.767173
label: 0, pred: 0.839686
label: 0, pred: 0.767176
label: 0, pred: 0.797579
label: 0, pred: 0.793824
label: 0, pred: 0.772110
label: 0, pred: 0.768660
label: 0, pred: 0.759537
label: 0, pred: 0.839686
label: 0, pred: 0.759537
label: 0, pred: 0.929430
label: 0, pred: 0.941911
label: 0, pred: 0.822525
label: 0, pred: 0.839686
label: 0, pred: 0.945538
label: 0, pred: 0.749724
label: 0, pred: 0.929430
label: 0, pred: 0.787042
label: 0, pred: 0.797579
label: 0, pred: 0.797784
label: 0, pred: 0.797784
label: 0, pred: 0.945538
label: 0, pred: 0.785153
label: 0, pred: 0.797784
label: 0, pred: 0.836042
label: 0, pred: 0.931993
label: 0, pred: 0.836042
label: 0, pred: 0.779831
label: 0, pred: 0.945538
label: 0, pred: 0.812733
label: 0, pred: 0.945538
label: 0, pred: 0.745542
label: 0, pred: 0.779849
label: 0, pred: 0.903047
label: 0, pred: 0.816076
label: 0, pred: 0.807510
label: 0, pred: 0.749971
label: 0, pred: 0.945538
label: 0, pred: 0.804371
label: 0, pred: 0.767173
label: 0, pred: 0.934794
label: 0, pred: 0.785153
label: 0, pred: 0.767173
label: 0, pred: 0.797784
label: 0, pred: 0.785153
label: 0, pred: 0.807510
label: 0, pred: 0.768660
label: 0, pred: 0.804371
label: 0, pred: 0.787042
label: 0, pred: 0.704733
label: 0, pred: 0.813373
label: 0, pred: 0.749724
label: 0, pred: 0.836042
label: 0, pred: 0.772110
label: 0, pred: 0.855798
label: 0, pred: 0.836042
label: 0, pred: 0.784896
label: 0, pred: 0.804371
label: 0, pred: 0.813373
label: 0, pred: 0.749724
label: 0, pred: 0.903047
label: 0, pred: 0.787042
label: 0, pred: 0.839686
label: 0, pred: 0.759537
label: 0, pred: 0.797579
label: 0, pred: 0.803833
label: 0, pred: 0.793824
label: 0, pred: 0.749724
label: 0, pred: 0.806166
label: 0, pred: 0.793824
label: 0, pred: 0.793824
label: 0, pred: 0.787042
label: 0, pred: 0.806166
label: 0, pred: 0.903047
label: 0, pred: 0.839686
label: 0, pred: 0.768660
label: 0, pred: 0.787042
label: 0, pred: 0.745542
label: 0, pred: 0.787042
label: 0, pred: 0.802708
label: 0, pred: 0.797784
label: 0, pred: 0.839686
label: 0, pred: 0.929430
label: 0, pred: 0.803833
label: 0, pred: 0.704733
label: 0, pred: 0.704733
label: 0, pred: 0.793824
label: 0, pred: 0.793824
label: 0, pred: 0.813373
label: 0, pred: 0.836042
label: 0, pred: 0.767173
label: 0, pred: 0.803833
label: 0, pred: 0.793824
label: 0, pred: 0.818067
label: 0, pred: 0.787566
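
For reference, a minimal sketch of the loading path described above, following the leaves README. The model filename and the feature vector are placeholders, and loadTransformation=true (so leaves applies the objective transformation stored with the model, e.g. the sigmoid for binary:logistic) is an assumption, not necessarily my exact code:

```go
package main

import (
	"fmt"

	"github.com/dmitryikh/leaves"
)

func main() {
	// Load the native XGBoost model saved from Spark via
	// xgbModel.nativeBooster.saveModel. The second argument tells leaves to
	// also load the objective transformation stored with the model (for
	// binary:logistic that is the sigmoid), so PredictSingle returns a
	// probability instead of a raw margin. Assumed true here.
	model, err := leaves.XGEnsembleFromFile("xgb_native.model", true) // placeholder path
	if err != nil {
		panic(err)
	}
	fmt.Printf("NEstimators: %d, NFeatures: %d\n", model.NEstimators(), model.NFeatures())

	// Placeholder feature vector; the real values come from one row of the
	// validation dataset.
	fvals := []float64{0.1, 0.2, 0.3}

	// nEstimators = 0 means "use all trees in the ensemble".
	pred := model.PredictSingle(fvals, 0)
	fmt.Printf("pred: %f\n", pred)
}
```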