Open ueihgnurt opened 5 days ago
Example code for how you're running the model in python and ORT web would be helpful.
The dummy code is here: git@github.com:ueihgnurt/test_onnx_web.git
https://github.com/ueihgnurt/test_onnx_web/blob/main/python_examples/python_example.ipynb uses OpenCV, not onnxruntime, to run the model as far as I can see, so there's no direct comparison to ORT web. I would suggest checking that the pre- and post-processing you're doing on the input/output is correct for the ONNX model.
Additionally, compare the raw float32 values produced for a given set of input values to remove any pre/post processing differences from the equation.
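One way to do that raw-value comparison (a sketch; the helper name and tolerance are mine, not from any ORT API): run the Python session with `sess.run(None, {input_name: x})[0]` and save it with `np.save`, dump the ORT Web output tensor's `.data` (a `Float32Array`) to JSON, then diff the flattened arrays:

```python
import numpy as np

def max_abs_diff(a, b):
    """Largest element-wise difference between two float32 tensors."""
    a = np.asarray(a, dtype=np.float32).ravel()
    b = np.asarray(b, dtype=np.float32).ravel()
    assert a.shape == b.shape, "outputs have different element counts"
    return float(np.max(np.abs(a - b)))

# Stand-in data for illustration; in practice `x` would be the Python
# onnxruntime output and `y` the values dumped from ORT Web.
x = np.linspace(0.0, 1.0, 12, dtype=np.float32)
y = x.copy()
y[3] += 1e-3

print(max_abs_diff(x, x))  # → 0.0 for identical outputs
print(max_abs_diff(x, y))  # roughly 1e-3 here; ~1e-5 or less is normal float noise
```

If the max difference is on the order of 1e-5 it is ordinary float32 noise; a difference the size of the values themselves points to a pre/post-processing or layout mismatch, not the runtime.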
So I tried using this code, and the result is the same as the OpenCV one. I didn't process anything at all. The error only occurs if I add a Conv2d into the model. You can test with this model: https://github.com/ueihgnurt/test_onnx_web/blob/main/vueexamples/public/onnxdinov2/test_empty_model.onnx — it contains no Conv2d and just returns the original image. With that ONNX model, both the web version and the Python version return the same result.
I mean, all I did was divide the RGB values by 255.0 and feed them into the model. How could the difference between the web version and the Python version be this big?
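For reference, the /255.0 preprocessing described here would look roughly like this in Python, assuming the model expects float32 NCHW input (the layout is an assumption; check the model's input shape in Netron or via `sess.get_inputs()`):

```python
import numpy as np

def preprocess(img_u8):
    """uint8 HWC RGB image -> float32 NCHW tensor in [0, 1]."""
    x = img_u8.astype(np.float32) / 255.0  # the /255.0 scaling from the issue
    x = x.transpose(2, 0, 1)               # HWC -> CHW (NOT a reshape)
    return x[np.newaxis, ...]              # add batch dim -> NCHW

# Illustrative 2x2 all-white image.
img = np.full((2, 2, 3), 255, dtype=np.uint8)
out = preprocess(img)
print(out.shape)  # → (1, 3, 2, 2), all values 1.0
```

The same transpose (not reshape) has to happen on the ORT Web side when filling the `Float32Array`; doing it on one side but not the other would explain a large divergence.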
Describe the issue
ONNX Web inference vs. Python + ONNX inference. This is what happens if I remove the Conv2d layer from the model.
As you can see, the input is correct, so the output is still correct when there is no Conv2d layer. The problem appears when I put the Conv2d layer back in. Everything uses float32, so I wonder what is causing this issue.
To reproduce
The Model:
Urgency
No response
ONNX Runtime Installation
Built from Source
ONNX Runtime Version or Commit ID
1.20.1
Execution Provider
'wasm'/'cpu' (WebAssembly CPU)
I don't think the result above is related much to float32 precision, since you can clearly see the Conv2d literally grids the original image into 3x3 parts with the positions shuffled. That's clearly not a Conv2d, or at least not how I think it works.
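That "gridded into parts with shuffled positions" look is a classic symptom of a data-layout mismatch rather than a broken Conv2d: if an HWC pixel buffer is reinterpreted (reshaped) as CHW instead of transposed, the channel planes come out as scrambled tiles of the image. A minimal numpy sketch of the difference (shapes are illustrative; this is my hypothesis, not a confirmed diagnosis):

```python
import numpy as np

h, w = 4, 4
img_hwc = np.arange(h * w * 3, dtype=np.float32).reshape(h, w, 3)

wrong_chw = img_hwc.reshape(3, h, w)    # reinterprets the flat buffer: scrambled
right_chw = img_hwc.transpose(2, 0, 1)  # true layout conversion: channel planes

print(np.array_equal(wrong_chw, right_chw))  # → False
```

It is worth checking how the `Float32Array` is filled before building the `ort.Tensor` on the web side: a plain copy of canvas RGBA/RGB data into a tensor declared as `[1, 3, h, w]` is exactly this reshape-instead-of-transpose mistake.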