microsoft / onnxjs

ONNX.js: run ONNX models using JavaScript

Compact model exported as ONNX, trained on customvision.ai, doesn't work #10

Open horvathj opened 5 years ago

horvathj commented 5 years ago

Hi,

I trained a model on customvision.ai with the compact setting. I think it is a SqueezeNet model. I exported the model in ONNX format (both 1.0 and 1.2) and tried it with the https://github.com/Microsoft/onnxjs/tree/master/examples/browser/squeezenet example.

It does not work: the model fails to load.

await session.loadModel("./my.onnx");

Error in Windows 10 / Chrome 70.0.3538.110:

onnx.min.js:1 Uncaught (in promise) TypeError: Cannot read property 'elemType' of null
    at Function.t.tensorValueTypeFromProto (onnx.min.js:1)
    at new t (onnx.min.js:8)
    at t.buildGraph (onnx.min.js:8)
    at new t (onnx.min.js:8)
    at Object.from (onnx.min.js:8)
    at t.load (onnx.min.js:8)
    at onnx.min.js:8
    at t.event (onnx.min.js:1)
    at e.initialize (onnx.min.js:8)
    at e. (onnx.min.js:8)

How can I load a model that was trained on customvision.ai?

ybrnathan commented 5 years ago

Hi horvathj,

Thank you for reporting the issue. Is it possible to share your model so we could take a closer look?

horvathj commented 5 years ago

Hi,

You can download it from here: http://deep-learning.hu/onnx/tree-10-ml.onnx. Test image: http://deep-learning.hu/onnx/tree-fenyo.jpg (label: fenyo).

(Info: on customvision.ai you can train a model with a single tag per image, or with multi-label tags per image.)

Thanks, Janos

ybrnathan commented 5 years ago

Hi Janos,

Thanks for providing the ONNX model. The reason this model fails to load is that it uses ONNX-ML operators (like ZipMap), which are not yet supported by ONNX.js. Here is the list of operators we support so far: https://github.com/Microsoft/onnxjs/blob/master/docs/operators.md.

We will gradually add more operators along the way. Please stay tuned.

Thanks, Nathan

horvathj commented 5 years ago

Hi Nathan,

Thank you for your answer.

1) Can you modify the graph loader so it can load part of the graph? The user could pass two additional optional parameters: the names of the start and end nodes.

That way users could leave out the end of the graph and use the softmax output directly.

2) It's strange that Flatten is not supported; it is quite common. Since Reshape is supported, it shouldn't be too hard to handle.

3) In the customvision.ai ONNX export I can find these operators:

a) ImageScaler - supported on WebGL but NOT on CPU (in my customvision.ai export it can maybe be skipped, since bias is 0,0,0 and scale is 1 - or users can handle it in preprocessing)

b) BatchNormalization - supported
c) Conv - supported
d) Relu - supported on CPU and WebGL
e) Pad - supported on WebGL but NOT on CPU
f) MaxPool - supported
g) Concat - supported on CPU and WebGL
h) GlobalAveragePool - supported
i) Flatten - NOT supported (but Reshape is)
j) Gemm - supported on WebGL and WASM but NOT on CPU
k) Softmax - supported
l-n) ArgMax, ZipMap, ArrayFeatureExtractor - not supported, but they can be skipped and handled in postprocessing

-> If there were an option to skip the end of the graph, and Flatten were supported, users could load the graph with the WebGL backend.

-> Additionally, if ImageScaler could be skipped at the beginning and Pad and Gemm were supported on CPU, users could load the graph on CPU too!

4) Can you reach out to the customvision.ai team to collaborate on a compatible export format?
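Point 1 above could be sketched roughly like this. The `trimGraph` helper and the plain node objects are hypothetical stand-ins for the loader's internals (ONNX.js has no such API today); the sketch only illustrates how a loader could keep the nodes that feed a chosen end output and drop the unsupported tail:

```javascript
// Hypothetical sketch: trim a graph at a user-supplied end output.
// Nodes are simplified { opType, inputs, outputs } records, not real protos.
function trimGraph(nodes, endOutput) {
  // Walk backwards from the requested end output, keeping only nodes
  // that contribute to it; everything after it is dropped.
  const needed = new Set([endOutput]);
  const kept = [];
  for (let i = nodes.length - 1; i >= 0; i--) {
    const node = nodes[i];
    if (node.outputs.some((o) => needed.has(o))) {
      kept.unshift(node);
      node.inputs.forEach((inp) => needed.add(inp));
    }
  }
  return kept;
}

// Example: a tiny Conv -> Softmax -> ZipMap chain, cut at the softmax output.
const nodes = [
  { opType: "Conv",    inputs: ["image"], outputs: ["feat"] },
  { opType: "Softmax", inputs: ["feat"],  outputs: ["probs"] },
  { opType: "ZipMap",  inputs: ["probs"], outputs: ["label_map"] },
];
console.log(trimGraph(nodes, "probs").map((n) => n.opType));
// -> [ 'Conv', 'Softmax' ]  (the unsupported ZipMap tail is gone)
```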
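On point 2, Flatten really is expressible through Reshape: ONNX's Flatten with attribute axis=k maps an input of shape [d0, ..., dn] to the 2-D shape [d0*...*d(k-1), dk*...*dn]. A minimal sketch of that shape computation (the `flattenShape` helper is illustrative, not an ONNX.js API):

```javascript
// Compute the output shape of Flatten(axis) so it can be emulated by Reshape.
function flattenShape(shape, axis = 1) {
  const prod = (dims) => dims.reduce((a, b) => a * b, 1);
  // Dimensions before `axis` collapse into the first output dim,
  // the rest collapse into the second.
  return [prod(shape.slice(0, axis)), prod(shape.slice(axis))];
}

console.log(flattenShape([1, 512, 7, 7]));  // -> [ 1, 25088 ]
console.log(flattenShape([2, 3, 4, 5], 2)); // -> [ 6, 20 ]
```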
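The pre/postprocessing workaround from point 3 could look like this in plain JavaScript. All helper names here are made up (not ONNX.js APIs); the ImageScaler parameters match the export described above (scale 1, bias 0,0,0), and `argMax` stands in for the unsupported ArgMax/ZipMap/ArrayFeatureExtractor tail:

```javascript
// ImageScaler moved into preprocessing: out[c][i] = in[c][i] * scale + bias[c]
// (CHW channel layout assumed).
function imageScaler(chw, scale, bias) {
  return chw.map((channel, c) => channel.map((v) => v * scale + bias[c]));
}

// Replaces the unsupported ArgMax/ZipMap/ArrayFeatureExtractor tail:
// pick the index of the largest softmax probability and look up its label.
function argMax(values) {
  let best = 0;
  for (let i = 1; i < values.length; i++) {
    if (values[i] > values[best]) best = i;
  }
  return best;
}

const labels = ["fenyo", "other"]; // example tags, as in the thread
const probs = [0.92, 0.08];        // pretend softmax output from the model
console.log(labels[argMax(probs)]); // -> fenyo
```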

ybrnathan commented 5 years ago

Hi Janos,

Supporting customvision.ai-exported ONNX models is in the plan. We would like to support those models properly instead of providing workarounds. We will keep you updated on the progress.

Thanks, Nathan

MrPec commented 5 years ago

Any news on this? We also have some trained compact models and tried to export them (both as 1.0 and 1.2), but we get a "Model is not valid!" message...