Semihal opened this issue 4 years ago
Thank you for your post. We noticed you have not filled out the following fields in the issue template. Could you update them if they are relevant in your case, or leave them as N/A? Thanks.
- TensorFlow version
- Bazel version
@tensorflowbutler, I filled out these two fields.
System information
Describe the problem
I am trying to convert ssd_mobilenet_v2_coco and then convert it to UFF -> CudaEngine. The original frozen_inference_graph.pb converts perfectly, but the frozen_inference_graph produced by object_detection/export_inference_graph.py does not. If you compare the two models by hash sums, they are different. How do I properly convert the checkpoint (.ckpt) so that I get the original frozen_inference_graph.pb?
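For reference, the export step I mean is the standard one from the TF Object Detection API; the paths and file names below are placeholders rather than my exact files:

```
# Standard export invocation from the TF Object Detection API
# (pipeline config, checkpoint prefix, and output directory are placeholders)
python object_detection/export_inference_graph.py \
    --input_type image_tensor \
    --pipeline_config_path path/to/ssd_mobilenet_v2_coco.config \
    --trained_checkpoint_prefix path/to/model.ckpt \
    --output_directory path/to/exported_model
```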
I run the conversion like this: