Open TheMattBin opened 1 week ago
This looks like a type error in the model. Do you have instructions for exporting the model so we can reproduce this? The graph appears to run a cumulative sum over boolean values produced by the Equal operator, but CumSum doesn't support boolean inputs, so a Cast from boolean to int32 or int64 is probably required in between. This could be an exporter/converter bug.
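The dtype mismatch described above can be sketched in plain Python (illustrative only, not the ONNX API): Equal yields booleans, and the cumulative sum needs them cast to an integer type first.

```python
# Plain-Python sketch of the suggested fix (stand-ins for ONNX ops,
# not the actual onnx/onnxruntime API).

def equal(a, b):
    """Element-wise equality, like the ONNX Equal op (returns booleans)."""
    return [x == y for x, y in zip(a, b)]

def cast_to_int64(mask):
    """The missing Cast: boolean -> int64."""
    return [int(v) for v in mask]

def cumsum(values):
    """Running sum, like the ONNX CumSum op (rejects booleans in ORT)."""
    total, out = 0, []
    for v in values:
        total += v
        out.append(total)
    return out

mask = equal([1, 2, 3, 2], [2, 2, 2, 2])   # [False, True, False, True]
positions = cumsum(cast_to_int64(mask))    # [0, 1, 1, 2]
```

In the exported graph, the equivalent fix would be a Cast node inserted between the Equal and CumSum nodes, which the exporter/converter would normally be expected to emit.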
Hi, thanks for the reply! I use Hugging Face Optimum to export my ONNX model; the script I use is below:
Describe the issue
After exporting the ONNX model with Hugging Face Optimum, I run the following script to check the model, and it reports no issues.
However, when I try to run inference with onnxruntime, I get the following error:
To reproduce
Environment: Google Colab
Inference code:
Urgency
No response
Platform
Web Browser
OS Version
Google Colab
ONNX Runtime Installation
Built from Source
ONNX Runtime Version or Commit ID
1.19.2
ONNX Runtime API
Python
Architecture
X64
Execution Provider
Default CPU
Execution Provider Library Version
No response