rpautrat / SuperPoint

Efficient neural feature detector and descriptor
MIT License

Ask for TensorRT expertise of converting the superpoint model. #151

Open VincentCheng24 opened 4 years ago

VincentCheng24 commented 4 years ago

TensorRT 7.1 on a Jetson AGX Xavier produces wrong results for the node converted from an ONNX Resize op (opset 11), which in turn was converted from a tf.image.resize_bilinear node in a TensorFlow frozen graph.

Does anyone have experience in this operation? Thanks a lot.

[Screenshot: onnx_res]

[Attachments: onnx model and TF frozen graph]
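Mismatched Resize outputs between ONNX and a backend are very often a coordinate-mapping disagreement. A pure-Python 1-D sketch of how ONNX Resize-11 maps output coordinates back to input coordinates under its different coordinate_transformation_mode values (the helper name `linear_resize` is hypothetical; depending on its align_corners / half_pixel_centers attributes, tf.image.resize_bilinear corresponds to one of these modes, and a backend that substitutes another mode produces systematically shifted values):

```python
def linear_resize(src, out_len, mode):
    """Linearly resize a 1-D list using one ONNX Resize-11 coordinate mapping."""
    n = len(src)
    out = []
    for i in range(out_len):
        if mode == "half_pixel":
            x = (i + 0.5) * n / out_len - 0.5
        elif mode == "align_corners":
            x = i * (n - 1) / (out_len - 1)
        elif mode == "asymmetric":
            x = i * n / out_len
        else:
            raise ValueError(mode)
        x = min(max(x, 0.0), n - 1)      # clamp to the valid input range
        lo = int(x)
        hi = min(lo + 1, n - 1)
        frac = x - lo
        out.append(src[lo] * (1 - frac) + src[hi] * frac)
    return out

src = [0.0, 1.0, 2.0, 3.0]
hp = linear_resize(src, 8, "half_pixel")
ac = linear_resize(src, 8, "align_corners")
asym = linear_resize(src, 8, "asymmetric")
# The three modes only agree at some sample positions; a mode mismatch shows
# up as a systematic shift across the whole resized feature map.
```

Running such a reference against the TensorRT layer output for a tiny input tensor can pinpoint which mode the engine actually applied.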

dinara92 commented 4 years ago

@VincentCheng24 can you please share your graph freezing code? I am trying to convert the model to OpenVINO IR format.

Is it similar to the following, or are your output_node_names different?

import tensorflow as tf
from tensorflow.python.framework import graph_io

frozen = tf.graph_util.convert_variables_to_constants(
    sess, sess.graph_def, ["superpoint/prob_nms", "superpoint/descriptors"])
graph_io.write_graph(frozen, './', 'inference_graph.pb', as_text=False)
jennyzhang2018 commented 4 years ago

@dinara92 did you get mo_tf.py working for the SuperPoint model sp_v6? I tried to convert it too (I trained it and have the model.meta data) and got some errors. Thanks!

dinara92 commented 4 years ago

@jennyzhang2018

@dinara92 did you get mo_tf.py working for the SuperPoint model sp_v6? I tried to convert it too (I trained it and have the model.meta data) and got some errors. Thanks!

I could not convert it using the mo_tf.py model optimizer script. First, I used the saved_model_dir parameter with sp_v6 and input_shape=[1,480,640,1] (NHWC, the OpenVINO standard image input format). It failed with: The operation is not implemented for node "superpoint/pred_tower0/map/while/box_nms/Where". (This error comes up in both OpenVINO 2020.1 and 2020.4.)

Then I froze the saved model into a .pb (keeping all output nodes, and also with only the "superpoint/prob_nms" output node) and upgraded OpenVINO to 2020.4 (the latest); the error just moved to node "superpoint/pred_tower0/map/while/box_nms/GatherNd".

In general, this means some operation is not implemented in the converter, so we need to cut the model before the unsupported node and implement the remaining operations ourselves. If you manage to convert it, please share your solution. Thank you.
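Since both failing nodes sit inside the box_nms subgraph, one way to "cut and implement the rest" is to export the network only up to the raw probability map and run the non-maximum suppression on the host. A minimal NumPy window-maximum NMS sketch (this is not identical to the repo's box_nms, which uses tf.image.non_max_suppression over boxes; the radius and threshold values are illustrative):

```python
import numpy as np

def simple_nms(prob, radius=4, threshold=0.015):
    """Zero out every pixel of a keypoint probability map that is not the
    maximum within a (2*radius+1)^2 window, or that is below threshold."""
    H, W = prob.shape
    pad = np.pad(prob, radius, mode="constant")
    local_max = np.ones_like(prob, dtype=bool)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dy == 0 and dx == 0:
                continue
            shifted = pad[radius + dy:radius + dy + H,
                          radius + dx:radius + dx + W]
            local_max &= prob >= shifted   # ties on a flat plateau all survive
    return np.where(local_max & (prob > threshold), prob, 0.0)
```

Because this runs in plain NumPy after inference, the converted IR no longer needs the Where/GatherNd ops that the Model Optimizer rejects.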

bishoymoussa commented 3 years ago

Has anyone figured out how to serve the SuperPoint model on OpenVINO?

saraswathi421 commented 3 years ago

After several attempts, we managed to convert our checkpoint to .pb files. However, when loading the .pb file in an inference script, we get the error below:

RuntimeError: MetaGraphDef associated with tags 'serve' could not be found in SavedModel. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: saved_model_cli available_tags: [{'serve', 'train'}]

We do not get this issue when loading the publicly available saved model. Can you please tell us where we are going wrong?
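The tag-sets listed in the error message hold the answer: your SavedModel was exported with a single MetaGraph tagged with both 'serve' and 'train', and the SavedModel loader matches by exact tag-set equality, not by subset, so requesting just 'serve' finds nothing. Passing both tags (e.g. tags=['serve', 'train'] to TF1's tf.saved_model.loader.load) should locate the MetaGraph. A tiny pure-Python illustration of the matching rule (the helper `find_meta_graph` is hypothetical, just to show the semantics; no TensorFlow needed):

```python
def find_meta_graph(available_tag_sets, requested_tags):
    """Mimic the SavedModel loader: a MetaGraph matches only if its tag set
    EXACTLY equals the requested set (subset matches do not count)."""
    requested = set(requested_tags)
    for tags in available_tag_sets:
        if set(tags) == requested:
            return tags
    return None

# The SavedModel from the error message advertises one tag-set: {'serve', 'train'}.
available = [{"serve", "train"}]
assert find_meta_graph(available, ["serve"]) is None                # fails, as in the issue
assert find_meta_graph(available, ["serve", "train"]) is not None   # exact match loads
```

The publicly released model presumably has a MetaGraph tagged with 'serve' alone, which is why the same loading code works there.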