I am trying to convert the head detection model from MXNet to ONNX using the steps in https://github.com/onnx/tutorials/blob/master/tutorials/MXNetONNXExport.ipynb.
My setup is as follows:
Requirement already satisfied: onnx in ./kenv/lib/python3.6/site-packages (1.6.0)
Requirement already satisfied: six in /home/username/.local/lib/python3.6/site-packages (from onnx) (1.13.0)
Requirement already satisfied: protobuf in ./kenv/lib/python3.6/site-packages (from onnx) (3.11.1)
Requirement already satisfied: typing-extensions>=3.6.2.1 in ./kenv/lib/python3.6/site-packages (from onnx) (3.7.4.1)
Requirement already satisfied: numpy in /home/username/.local/lib/python3.6/site-packages (from onnx) (1.17.4)
Requirement already satisfied: setuptools in ./kenv/lib/python3.6/site-packages (from protobuf->onnx) (42.0.2)
When I run the conversion code, I get the following error. Are there any workarounds?
$ python mxconverter.py
INFO:root:Converting json and weight file to sym and params
[20:53:26] src/nnvm/legacy_json_util.cc:209: Loading symbol saved by previous version v1.5.0. Attempting to upgrade...
[20:53:26] src/nnvm/legacy_json_util.cc:217: Symbol successfully upgraded!
Traceback (most recent call last):
  File "mxconverter.py", line 17, in <module>
    converted_model_path = onnx_mxnet.export_model(sym, params, [input_shape], np.float32, onnx_file)
  File "/home/username/tensorflowpython/kenv/lib/python3.6/site-packages/mxnet/contrib/onnx/mx2onnx/export_model.py", line 83, in export_model
    verbose=verbose)
  File "/home/username/tensorflowpython/kenv/lib/python3.6/site-packages/mxnet/contrib/onnx/mx2onnx/export_onnx.py", line 312, in create_onnx_graph_proto
    checker.check_graph(graph)
  File "/home/username/tensorflowpython/kenv/lib/python3.6/site-packages/onnx/checker.py", line 53, in checker
    proto.SerializeToString(), ctx)
onnx.onnx_cpp2py_export.checker.ValidationError: Node (slice_axis16) has input size 1 not in range [min=3, max=5].
==> Context: Bad node spec: input: "softmax0" output: "slice_axis16" name: "slice_axis16" op_type: "Slice" attribute { name: "axes" ints: 1 type: INTS } attribute { name: "ends" ints: 1 type: INTS } attribute { name: "starts" ints: 0 type: INTS }
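For reference, my mxconverter.py follows the export snippet from that tutorial. A minimal sketch of it is below; the symbol/params paths and the input shape are placeholders, not the exact values I use:

```python
import numpy as np
from mxnet.contrib import onnx as onnx_mxnet

# Placeholders -- substitute the head-detection model's actual files and input size
sym = './model-symbol.json'      # MXNet symbol (network definition)
params = './model-0000.params'   # MXNet trained weights
input_shape = (1, 3, 480, 640)   # NCHW shape the network expects
onnx_file = './model.onnx'       # where the converted model should be written

# export_model loads the symbol/params, builds the ONNX graph, and runs the ONNX checker on it
converted_model_path = onnx_mxnet.export_model(sym, params, [input_shape],
                                               np.float32, onnx_file)
print('Exported to', converted_model_path)
```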
Did you find a fix for this? I am facing the same issue.
Reinstall onnx==1.2.2:
python3 -m pip uninstall onnx
python3 -m pip install onnx==1.2.2
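As far as I can tell, the root cause is an opset mismatch: MXNet's exporter emits Slice with starts/ends/axes as attributes (the pre-opset-10 form), while the checker in onnx 1.6 validates the node against the newer Slice spec, where those values are inputs, hence "input size 1 not in range [min=3, max=5]". With onnx 1.2.2 the older spec applies and the export validates. A quick sanity check after downgrading (the .onnx path below is just a placeholder):

```python
import onnx

print(onnx.__version__)            # should print 1.2.2 after the downgrade

# Re-run the export, then load and validate the produced file
model = onnx.load('./model.onnx')  # placeholder path to the exported model
onnx.checker.check_model(model)    # raises ValidationError if the graph is malformed
print('ONNX model looks valid')
```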