oandrienko / fast-semantic-segmentation

ICNet and PSPNet-50 in Tensorflow for real-time semantic segmentation

Help: Convert ICNET_0.5 to onnx file #28

Closed SiR0N closed 4 years ago

SiR0N commented 4 years ago

Hello,

I want to convert this TF model (ICNET_0.5) to ONNX, and I followed this example: ConvertingSSDMobilenetToONNX

I understood that if I just want to run inference I should use the frozen graph (frozen_inference_graph.pb), so I renamed it to saved_model.pb (it seems tf2onnx does not recognize any other name) and ran the following, which failed with this error:

C:\Users\esarojp\Desktop\newmodel\0818_icnet_0.5_1025_resnet_v1.tar> python -m tf2onnx.convert --opset 10 --fold_const --saved-model .\0818_icnet_0.5_1025_resnet_v1\saved_model\ --output MODEL.onnx

 - WARNING - From C:\Users\esarojp\AppData\Local\Continuum\anaconda3\lib\site-packages\tf2onnx\verbose_logging.py:72: The name tf.logging.set_verbosity is deprecated. Please use tf.compat.v1.logging.set_verbosity instead.

Traceback (most recent call last):
  File "C:\Users\esarojp\AppData\Local\Continuum\anaconda3\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "C:\Users\esarojp\AppData\Local\Continuum\anaconda3\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\esarojp\AppData\Local\Continuum\anaconda3\lib\site-packages\tf2onnx\convert.py", line 161, in <module>
    main()
  File "C:\Users\esarojp\AppData\Local\Continuum\anaconda3\lib\site-packages\tf2onnx\convert.py", line 123, in main
    args.saved_model, args.inputs, args.outputs, args.signature_def)
  File "C:\Users\esarojp\AppData\Local\Continuum\anaconda3\lib\site-packages\tf2onnx\loader.py", line 103, in from_saved_model
    meta_graph_def = tf.saved_model.loader.load(sess, [tf.saved_model.tag_constants.SERVING], model_path)
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\util\deprecation.py", line 324, in new_func
    return func(*args, **kwargs)
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\saved_model\loader_impl.py", line 269, in load
    return loader.load(sess, tags, import_scope, **saver_kwargs)
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\saved_model\loader_impl.py", line 422, in load
    **saver_kwargs)
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\saved_model\loader_impl.py", line 349, in load_graph
    meta_graph_def = self.get_meta_graph_def_from_tags(tags)
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\saved_model\loader_impl.py", line 327, in get_meta_graph_def_from_tags
    "\navailable_tags: " + str(available_tags))
RuntimeError: MetaGraphDef associated with tags 'serve' could not be found in SavedModel. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: `saved_model_cli`
available_tags: [set()]

and when I run:

C:\Users\esarojp\Desktop\newmodel\0818_icnet_0.5_1025_resnet_v1.tar> saved_model_cli show --dir .\0818_icnet_0.5_1025_resnet_v1\saved_model\ --tag_set serve  --signature_def serving_default
Traceback (most recent call last):
  File "C:\Users\esarojp\AppData\Local\Continuum\anaconda3\Scripts\saved_model_cli-script.py", line 10, in <module>
    sys.exit(main())
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\tools\saved_model_cli.py", line 909, in main
    args.func(args)
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\tools\saved_model_cli.py", line 621, in show
    _show_inputs_outputs(args.dir, args.tag_set, args.signature_def)
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\tools\saved_model_cli.py", line 133, in _show_inputs_outputs
    tag_set)
  File "C:\Users\esarojp\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\tools\saved_model_utils.py", line 120, in get_meta_graph_def
    ' could not be found in SavedModel')
RuntimeError: MetaGraphDef associated with tag-set serve could not be found in SavedModel

Any idea what is wrong?

oandrienko commented 4 years ago

Hey @SiR0N, thanks for your interest in the project. Unfortunately, I haven't played around with ONNX or their conversion script. However, I think your problem might be related to using a frozen GraphDef as a SavedModel (since you mention you just renamed the file). Although they are similar, the formats are different.

Taking a quick look, the conversion tool seems to accept a frozen GraphDef as input, so you just need to change the flags you're using. Or, you can always convert the model to a SavedModel first.
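
If you go the SavedModel route, something roughly like this should do it (just a sketch using the TF 1.x compat API; the tensor names "inputs:0" and "predictions:0" are what the exported graphs here should use, but double-check against your graph):

import tensorflow as tf

# Load the frozen graph def.
with tf.io.gfile.GFile("frozen_inference_graph.pb", "rb") as f:
    graph_def = tf.compat.v1.GraphDef()
    graph_def.ParseFromString(f.read())

# Re-import it into a session and export a SavedModel tagged "serve",
# which is the tag-set tf2onnx's --saved-model loader looks for.
with tf.compat.v1.Session(graph=tf.Graph()) as sess:
    tf.import_graph_def(graph_def, name="")
    inputs = sess.graph.get_tensor_by_name("inputs:0")
    predictions = sess.graph.get_tensor_by_name("predictions:0")
    tf.compat.v1.saved_model.simple_save(
        sess, "./exported_saved_model",
        inputs={"inputs": inputs},
        outputs={"predictions": predictions})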

Hope this helps! Will close this issue for now but let me know if you have any issues with the checkpoints.

SiR0N commented 4 years ago

Hello! Thanks for your reply, it was really helpful. I thought a SavedModel and a frozen graph were the same thing; I should have checked the tool more thoroughly before asking.

I was able to convert it with this:

python -m tf2onnx.convert --graphdef .\0818_icnet_0.5_1025_resnet_v1\frozen_inference_graph.pb --output frozen.onnx --fold_const --opset 10 --inputs inputs:0 --outputs predictions:0

I'm attaching the file in case you want to have a look (I haven't checked it yet): ICNET_0.5.onnx.zip
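
In case anyone wants to sanity-check an export like this, something along these lines should work (just a sketch, assuming the onnx and onnxruntime packages are installed; the tensor names are the ones from the conversion command above):

import onnx
import onnxruntime as ort

# Structural check of the exported graph; raises if the model is invalid.
model = onnx.load("frozen.onnx")
onnx.checker.check_model(model)

# Inspect the input/output names the runtime actually sees.
sess = ort.InferenceSession("frozen.onnx")
print([i.name for i in sess.get_inputs()])   # expecting something like 'inputs:0'
print([o.name for o in sess.get_outputs()])  # expecting something like 'predictions:0'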

oandrienko commented 4 years ago

Awesome, I'm glad it was helpful! The input and output tensors look correct, so it sounds like it should work. Good luck!

sctrueew commented 4 years ago

@SiR0N Hi,

How can I use the ONNX model in C++ or Python? Can you share the evaluation code?

SiR0N commented 4 years ago

Hi @zpmmehrdad, unfortunately I no longer have access to the code, but I was able to get it running in Python. For the pre/post-processing, check this: https://modeldepot.io/oandrienko/icnet-for-fast-segmentation

The only part you need to change is the inference. For that, I would recommend having a look here (step 3): https://github.com/microsoft/onnxruntime/blob/master/docs/python/tutorial.rst. In sess.run you should pass your input and output names; to find them, open the ONNX file with netron. It is really useful when working with ONNX, since it shows all the info about the computational graph.
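
Roughly, the inference part would look something like this (just a sketch; the dummy input below is a placeholder, so feed in whatever your preprocessing step produces, with the shape/dtype that netron or the session reports for your graph):

import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("frozen.onnx")
inp = sess.get_inputs()[0]
out = sess.get_outputs()[0]
print(inp.name, inp.shape, inp.type)  # e.g. 'inputs:0' plus the expected shape/dtype
print(out.name)                       # e.g. 'predictions:0'

# Placeholder frame: replace with the preprocessed image from the
# modeldepot example, matching the shape/dtype printed above.
image = np.zeros((1, 1025, 2049, 3), dtype=np.uint8)

predictions = sess.run([out.name], {inp.name: image})[0]
print(predictions.shape)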

Hope it helps you!