ordicker opened this issue 1 year ago (status: Open)
Do you know if your model loads using onnxruntime from C or python?
```python
import onnx
onnx.load("<path>/model.onnx")
```
works
Ok, thanks, so it is likely a bug in this package! I don't have enough time to work on this. If you feel like doing this, here is one approach to pinpoint the bug:
Thanks a lot for digging into this! I never looked into the Python implementation, but I am also curious how it works.
Hi nice package!
I'm trying to add new ops to ONNX.jl, and I use this package to test whether the ONNX file is valid (loadable and returns the right results). I'm using the ONNX backend test suite, and I think there is a bug in this package, but I'm not sure.
Here is a minimal working example: a simple model for the "test_min_int16" case (model.zip). Basically, it's a min(x, y) graph where x, y are INT16. And I get this:
I think that should work.
For "test_min_int32" it is working (the whole test suite, in fact).