paviddavid opened 5 years ago
Thank you for your post. We noticed you have not filled out the following fields in the issue template. Could you update them if they are relevant in your case, or leave them as N/A? Thanks.
- What is the top-level directory of the model you are using
- Have I written custom code
- OS Platform and Distribution
- TensorFlow installed from
- TensorFlow version
- Bazel version
- CUDA/cuDNN version
- GPU model and memory
- Exact command to reproduce
@tensorflowbutler Updated the post by adding the missing information.
I am getting the same error with the same folder structure of the trained model, but my method is different:
`graph_def.ParseFromString(graph_content)`
I got a similar error; try using `frozen_inference_graph.pb` instead of `saved_model.pb`. That worked for me. The thing is, although the files look similar, their internal formats are different.
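To illustrate the point above: a SavedModel's `saved_model.pb` normally sits next to a `variables/` directory, while a frozen graph is a standalone GraphDef file. Below is a minimal stdlib-only sketch that guesses which kind of `.pb` you have from the directory layout; `pb_kind` is a hypothetical helper for illustration, not part of TensorFlow, and it does not actually parse the protobuf.

```python
import os
import tempfile

def pb_kind(pb_path):
    """Heuristic: saved_model.pb next to a variables/ dir => SavedModel proto;
    anything else (e.g. frozen_inference_graph.pb) => bare GraphDef proto."""
    dirname, basename = os.path.split(pb_path)
    if basename == "saved_model.pb" and os.path.isdir(os.path.join(dirname, "variables")):
        return "saved_model"
    return "frozen_graph"

# Demo with a fake SavedModel export layout:
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "variables"))
open(os.path.join(root, "saved_model.pb"), "wb").close()

print(pb_kind(os.path.join(root, "saved_model.pb")))  # saved_model
print(pb_kind("frozen_inference_graph.pb"))           # frozen_graph
```

Feeding the wrong kind into code that expects the other (e.g. calling `GraphDef.ParseFromString` on a `saved_model.pb`) is what typically produces the parse error in this thread.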
I faced the same error as @paviddavid in the method `MergeFromString`.
@annie-surla I'm unable to use `export_inference_graph.py` to generate `frozen_inference_graph.pb`. The author @pkulzc suggests in issue #8844 that:
> placeholder and frozen graph are all TF1 features. In TF2 you should use SavedModel. See `exporter_main_v2.py`
So I'm under the impression that in TF2 the SavedModel subsumes the frozen inference graph.
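For what it's worth, the TF2 loading path that `exporter_main_v2.py` targets is `tf.saved_model.load` plus the `serving_default` signature, rather than freezing a graph and calling `ParseFromString`. Here is a minimal, self-contained sketch of that mechanism; `TinyModel` is a toy stand-in for an exported detection model, not the real Object Detection API model.

```python
import tempfile
import tensorflow as tf

# Toy stand-in for an exported model: a tf.Module with a serving signature.
class TinyModel(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.float32)])
    def serve(self, x):
        return {"doubled": x * 2.0}

export_dir = tempfile.mkdtemp()
model = TinyModel()
tf.saved_model.save(model, export_dir, signatures={"serving_default": model.serve})

# The TF2 way to run inference on an exported SavedModel:
loaded = tf.saved_model.load(export_dir)
infer = loaded.signatures["serving_default"]
out = infer(x=tf.constant([1.0, 2.0]))
print(out["doubled"].numpy())  # [2. 4.]
```

For a real model exported by `exporter_main_v2.py`, you would point `tf.saved_model.load` at the export's `saved_model/` directory and call the `serving_default` signature with an image tensor instead.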
I'm trying to use `infer_detections.py` to generate a detection TF record. I have to supply a frozen inference graph (or a SavedModel in TF2) as the `--inference_graph` argument to `infer_detections.py`, and I got the same error when I supplied `saved_model.pb`. Does anyone know a better approach to obtain a detection TF record?
@yang07ly I am seeing the exact same error. Did you find a way to solve it?
No, I have not. It seems to be a TF2 feature that is still in progress for the authors. I edited the source code to print the data I wanted, though.
Hello, I am encountering the same error right now. Have you found a solution for this?
Hello all,
I am currently using the TensorFlow Object Detection API to train a new model and test it with the KITTI data (as provided within the API).
What have I done? First, I downloaded the pre-trained ResNet101 model (trained on KITTI data) from the model zoo and ran the inference; everything worked as expected.
However, when running the inference on a fine-tuned model, I get the following error:
I use this command to run the inference:
Does anybody know why this error appears? Could it be caused by the structure in which the model is saved? The downloaded pre-trained ResNet101 (trained on KITTI data) is organized in the following directory structure:
However, the data folder created in my training output directory is organized as follows:
Am I using the wrong .pb file? I do not see a `frozen_inference_graph.pb` file (or anything similar). Does anybody have a hint on how I could solve this?
Thanks a lot in advance and kind regards