Closed DiTo97 closed 5 months ago
I assume

```python
model = BoL2EmotionV2()
```

?
@xadupre that's right, I forgot to mention it
I tried to replicate, but it seems to work for me. The code I wrote is here: #1103. Feel free to add a comment if I did something wrong.
@xadupre, appreciate the code replication.
The code you wrote works for me as well; I likely didn't do a great job of explaining it in the original post.
The problem arises when we save the exported model to a .onnx file and try to instantiate the inference session from the exported file at a later time.
To be clearer, snippet 1 works, snippet 2 does not.

Snippet 1:

```python
...
exported = to_onnx(model, ...)
modelengine = onnxruntime.InferenceSession(
    exported.SerializeToString(), providers=["CPUExecutionProvider"]
)
```

Snippet 2:

```python
...
exported = to_onnx(model, ...)
with open("model.onnx", "wb") as f:
    f.write(exported.SerializeToString())
modelengine = onnxruntime.InferenceSession(
    "model.onnx", providers=["CPUExecutionProvider"]
)
```
I tried to replicate, but it still works for me. Saving the model to disk should not change anything. The error comes from shape_inference. This is the model I get.
Can you run the updated unit test? I also need to know the versions you are using (onnx, scikit-learn, onnxmltools, lightgbm, onnxruntime).
@xadupre we will replicate the unit test in our own environment and report back ASAP!
@andreaGiacolono is in charge of the experiments.
Hi @xadupre, I replicated the unit test and it gave me this error message:
```
/root/ (unittest.loader._FailedTest) ... ERROR
======================================================================
ERROR: /root/ (unittest.loader._FailedTest)
----------------------------------------------------------------------
AttributeError: module '__main__' has no attribute '/root/'
----------------------------------------------------------------------
Ran 1 test in 0.004s

FAILED (errors=1)
An exception has occurred, use %tb to see the full traceback.
SystemExit: True
/opt/conda/lib/python3.10/site-packages/IPython/core/interactiveshell.py:3561: UserWarning: To exit: use 'exit', 'quit', or Ctrl-D.
  warn("To exit: use 'exit', 'quit', or Ctrl-D.", stacklevel=1)
```
These are the versions:

- onnx 1.16.1
- scikit-learn 1.2.2
- lightgbm 4.2.0
- onnxmltools 1.12.0
- onnxruntime 1.18.0
I forgot to ask about the version of numpy. Did you run just one test or several? It seems the way you ran the test is not right. Can you do `pytest <test_file>`?
The numpy version is 1.26.4; I'll try with pytest.
scikit-learn uses double precision by default. onnx uses float by default and is strongly typed, but the converter uses the input type to guess the output type. You can read https://onnx.ai/sklearn-onnx/auto_examples/plot_cast_transformer.html; inserting a CastTransformer in your pipeline should fix it.
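The double-vs-float discrepancy behind this can be seen with plain numpy (the CastTransformer from the linked example performs the equivalent cast inside the pipeline, before the model runs):

```python
import numpy as np

x64 = np.float64(0.1)  # what scikit-learn computes with
x32 = np.float32(x64)  # what the ONNX graph computes with

# the cast changes the value slightly, so the two are not equal
print(bool(x64 == x32))  # False: 0.1 is not exactly representable

# a decision threshold learned in double precision can therefore flip
threshold = 0.1
print(bool(x64 <= threshold))              # True
print(bool(np.float64(x32) <= threshold))  # False
```

This is why a model can look well calibrated against the original yet diverge on inputs that land near a split threshold.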
We ran the export and the InferenceSession again and it seems to be working. For now we can close the issue, but we will continue to monitor. Thank you!
:)
When converting the following model:

to ONNX with the following conversion code:

the inference session from the exported model works just fine and is well calibrated with the original model:

but loading the model from file results in the error in the title. What might be the problem?