jw3126 / ONNXRunTime.jl

Julia bindings for onnxruntime

Can't load simple model (with 8bit and 16bit inputs) #28

Open · ordicker opened 1 year ago

ordicker commented 1 year ago

Hi, nice package!

I'm trying to add new ops to ONNX.jl, and I use this package to test whether the ONNX file is valid (loadable and returns the right results). I'm using the ONNX backend test suite, and I think there is a bug in this package, but I'm not sure.

Here is a minimal working example. It uses the simple model from the "test_min_int16" case (model.zip). Basically, it's a min(x, y) graph where x and y are INT16.

import ONNXRunTime as OX
OX.load_inference("model.onnx") #<test_dir>/data/node/test_min_int16/model.onnx

And I get this:

ERROR: Could not find an implementation for Min(13) node with name ''

I think that should work.

For "test_min_int32" is it working. (the whole test suit)

jw3126 commented 1 year ago

Do you know whether your model loads using onnxruntime from C or Python?

ordicker commented 1 year ago
import onnx
onnx.load("<path>/model.onnx")

works

jw3126 commented 1 year ago

Ok, thanks, so it is likely a bug in this package! I don't have enough time to work on this myself. If you feel like doing it, here is one approach to pinpoint the bug:

  1. Load the model from C.
  2. If that works, use the low-level API of this package, ONNXRunTime.CAPI, to translate the C code 1:1 to Julia; see the example in the README and the sketch below.
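
For concreteness, a minimal sketch of step 2, mirroring the low-level example in the README and assuming the failure already happens at session creation (which is where the "Could not find an implementation" error is raised):

using ONNXRunTime.CAPI

api = GetApi()
env = CreateEnv(api, name="debug_min_int16")  # hypothetical env name
so = CreateSessionOptions(api)
# Session creation is where onnxruntime resolves op kernels, so the
# Min(13)/INT16 error should surface on this call if the bug is
# reachable from the Julia side.
session = CreateSession(api, env, "model.onnx", so)
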
ordicker commented 1 year ago

It is in onnxruntime :(

Issue

Now I'm not sure how the Python version works.

jw3126 commented 1 year ago

Thanks a lot for digging into this! I never looked into the Python implementation, but I am also curious how it works.