VanderBieu opened 19 hours ago
I think the problem might be that SessionInput is implemented with static typing. Specifically, SessionInput is constructed from a HashMap<K, V> or a Vec<(K, V)>, so every value is coerced to a single element type such as f32 or i64. My suggestion is that input.rs be reworked to use a more flexible structure.
Check the expected type of the graph's inputs using a program like Netron.
I double-checked; there is nothing wrong with the input types.
How are you creating the tensors? Are you certain they are the expected dtype? (You can print Value::dtype to check.)
I have an ONNX model that takes one float tensor and three int tensors as input. The inference session works well with the Python version of onnxruntime, but it fails with version "2.0.0-rc.6" of ort. The input is

in Rust:

in Python:
No matter how I convert float to int or int to float, the inference session keeps failing with
{ code: InvalidArgument, msg: "Unexpected input data type. Actual: (tensor(float)) , expected: (tensor(int64))" }