onnx / tutorials

Tutorials for creating and using ONNX models

mxnet-model-export is no longer supported. #212

Open quantum-fusion opened 4 years ago

quantum-fusion commented 4 years ago

Bug Report

If the model conversion is failing for a tutorial in this repo, report the bug here. However, if the bug is related to general model conversion, please go to the appropriate converter repo.

Describe the bug


The ONNXMXNetServer.ipynb notebook fails on the mxnet-model-export step.

See the error message:

mxnet-model-export is no longer supported. Please use model-archiver to create 1.0 model archive. For more detail, see: https://pypi.org/project/model-archiver
ls: squeezenet.model: No such file or directory

quantum-fusion commented 4 years ago

See cell [9] in the Jupyter notebook:

!mxnet-model-export --model-name squeezenet --model-path .
!ls -l squeezenet.model

vinitra-zz commented 4 years ago

After poking around a bit, I see that the mxnet-model-export API has been replaced by the multi-model-server API. You will need to translate the command to the API here: https://github.com/awslabs/multi-model-server
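Roughly, the failing notebook cell would need to become something like the sketch below. The flags are taken from the model-archiver CLI, but the handler module is a placeholder borrowed from the MMS image-classification examples, so treat this as an assumption rather than a drop-in fix:

```
!pip install model-archiver

# Package the model directory into a .mar archive instead of a .model file.
# --handler is required by model-archiver; mxnet_vision_service:handle is the
# handler used in the MMS image examples (placeholder, adjust to your model).
!model-archiver --model-name squeezenet --model-path . --handler mxnet_vision_service:handle
!ls -l squeezenet.mar
```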

Here's an example with the SSD model that uses the model-archiver tool to enable the same functionality as the previous API.

https://github.com/awslabs/multi-model-server/blob/abf6e9ad2869af3a5fe83b24a7096e1eead8ae4a/examples/ssd/README.md
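Once you have a .mar file, serving and querying it looks roughly like this (a sketch based on the multi-model-server README; the model store path and the test image are placeholders):

```
# start the server with the archive placed in a model store directory
multi-model-server --start --model-store model_store --models squeezenet=squeezenet.mar

# the default inference endpoint is /predictions/<model-name> on port 8080
curl -X POST http://127.0.0.1:8080/predictions/squeezenet -T kitten.jpg
```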

Hope that helps to unblock! Thanks for reporting the bug.

quantum-fusion commented 4 years ago

@vinitra The problem is that I need to be able to accept a generic .ONNX model from the Model Zoo (https://github.com/onnx/models) and then convert it into .MAR format in order for the multi-model-server to accept it.

The procedure for converting the .ONNX file is here (https://github.com/awslabs/multi-model-server/blob/f8845f917187957d0ae44c6a44b8fecec6746811/model-archiver/docs/convert_from_onnx.md), and it states that the archiver expects a signature.json and a synset.txt file in order to convert the .ONNX model to .MAR properly.
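Following that doc, the command would be something like this (a sketch: the directory layout and handler are my assumptions; the point is that the .onnx file alone is not enough):

```
# mobilenet/ would need to contain the .onnx file *plus* signature.json and synset.txt
model-archiver --model-name mobilenet --model-path mobilenet --handler mxnet_vision_service:handle
```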

Inspecting any model in the ONNX Model Zoo shows that only the .ONNX model files are offered.

See MobileNet as an example that offers only a .ONNX model (https://github.com/onnx/models/tree/master/vision/classification/mobilenet/model)

There is no signature.json or synset.txt file for the MobileNet model.

Do you have any idea how to generate these files if they are not offered? I don't know how to generate them.

quantum-fusion commented 4 years ago

The big picture is that I am using the Microsoft TensorFlow-to-ONNX converter, and I can produce an ONNX model; however, the other artifacts are not produced.
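For reference, the conversion step is roughly the following (assuming the converter in question is tensorflow-onnx / tf2onnx; the SavedModel path and opset are placeholders):

```
pip install tf2onnx

# convert a TensorFlow SavedModel to ONNX; this yields only the .onnx file,
# with no signature.json or synset.txt alongside it
python -m tf2onnx.convert --saved-model ./my_saved_model --opset 11 --output model.onnx
```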

If I can convert ONNX to MAR format, then I can host a custom ONNX model that originates from TensorFlow.

The problem is that I do not have a utility to produce the stated artifacts. The examples you provided only download the artifacts from Amazon, which does not help here, because the ONNX Model Zoo provides its models only in ONNX format, not MAR format.

Please explain how to generate the artifacts, not simply download them.
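For what it's worth, if the file layout in the convert_from_onnx.md doc is right, writing the two missing files by hand would look something like the sketch below; the input name, shape, and labels are placeholders that would still need to be confirmed against the actual model:

```
# signature.json describes the model's input tensor(s); the schema follows the
# MMS image-classifier examples. "data" and the 1x3x224x224 shape are
# placeholders -- inspect the actual ONNX graph for the real values.
cat > signature.json <<'EOF'
{
  "inputs": [
    { "data_name": "data", "data_shape": [1, 3, 224, 224] }
  ]
}
EOF

# synset.txt is just one human-readable class label per line, in the same
# order as the model's output indices (for ImageNet classifiers, the standard
# 1000-label synset list). The three labels below are placeholders.
printf 'tench\ngoldfish\ngreat white shark\n' > synset.txt
```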