Annanapan opened 6 months ago
I add:

```python
from skl2onnx import update_registered_converter
from skl2onnx.common.shape_calculator import calculate_linear_classifier_output_shapes
from onnxmltools.convert.xgboost.operator_converters.XGBoost import convert_xgboost

update_registered_converter(
    XGBClassifier,
    "XGBoostClassifier",
    calculate_linear_classifier_output_shapes,
    convert_xgboost,
    options={"nocl": [True, False], "zipmap": [True, False, "columns"]},
)
```
and the error disappears, but the predictions differ from the raw predictions obtained by running the pipeline directly.
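One common source of small discrepancies: sklearn-onnx converters generally compute in float32, so casting float64 features down loses precision before the model ever sees them. A minimal numpy illustration of the lossy cast, independent of ONNX:

```python
import numpy as np

# A float64 value carrying more precision than float32 can represent.
x64 = np.float64(0.123456789012345678)
x32 = np.float32(x64)

# The cast is lossy: the float32 round-trip does not recover x64 exactly,
# which can nudge tree split decisions and probabilities slightly.
diff = abs(float(x64) - float(x32))
print(diff > 0)
```

If the differences you see are larger than this kind of rounding noise, the mismatch is likely elsewhere (for example in how the inputs are fed to the session).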
Is this the right way to prepare the input for prediction using ONNX?

```python
X_test_inputs = {c: X_test[c].values for c in X_test.columns}
for c in num_features:
    if X_test[c].dtype == "float64":
        X_test_inputs[c] = X_test_inputs[c].astype(np.float32)
for k in X_test_inputs:
    X_test_inputs[k] = X_test_inputs[k].reshape((X_test_inputs[k].shape[0], 1))
```
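For reference, here is a self-contained sketch of that same input preparation on a toy frame (the column names and data are hypothetical stand-ins for `X_test` and `num_features`):

```python
import numpy as np
import pandas as pd

# Toy data standing in for X_test; "age" is the only numeric feature.
X_test = pd.DataFrame({"age": [25.0, 40.0], "city": ["a", "b"]})
num_features = ["age"]

# One entry per column, matching a model converted with per-column inputs.
inputs = {c: X_test[c].values for c in X_test.columns}
for c in num_features:
    if X_test[c].dtype == "float64":
        inputs[c] = inputs[c].astype(np.float32)  # ONNX float tensors are float32
for k in inputs:
    inputs[k] = inputs[k].reshape((-1, 1))  # 2-D column vectors of shape (n, 1)
```

A dict shaped like this is what you would pass to `onnxruntime.InferenceSession.run(None, inputs)` when the converted pipeline declares one named input per column.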
You should follow this tutorial to register a XGB model: https://onnx.ai/sklearn-onnx/auto_tutorial/plot_gexternal_xgboost.html.
When converting the pipeline to ONNX, I hit an error:
The pipeline contains a preprocessor and an XGBoost tree model; I created it as follows:
The error occurs while converting the pipeline. I checked that all the steps in the preprocessor should be supported, so I wonder whether it is ONNX that cannot handle complex transformers.
Versions: skl2onnx 1.16.0, scikit-learn 1.4.0, Python 3.11.7