td43 opened this issue 3 years ago
Hi, you might need to specify hparams on your inference command line:
python inspector.py \
--mode=infer \
--model_name=efficientdet-d0 \
--saved_model_dir=/home/daniel_tobon/workspace/freeze_model/efficientdet-d0_frozen.pb \
--input_image=/home/daniel_tobon/workspace/dataset/IMAGENES/009076ba-42a4-451e-abfb-cb476eaec327.jpg \
--output_image_dir=/home/daniel_tobon/workspace/ \
--hparams=/home/daniel_tobon/workspace/tfrecords/hparams_config.yaml  # Add this line
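The yaml you pass through --hparams is what carries num_classes and your custom label_map; without it the tools fall back to the default COCO id-to-name table. A quick way to double-check what your yaml actually contains (the file name and example classes below are only illustrative):

# Minimal sketch: print the num_classes / label_map carried by the hparams yaml.
# The path and the example classes are placeholders for your own config.
import yaml  # PyYAML

with open("hparams_config.yaml") as f:
    hparams = yaml.safe_load(f)

# Expect something like: 2  {1: 'my_class_a', 2: 'my_class_b'}
print(hparams.get("num_classes"), hparams.get("label_map"))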
Hi @mingxingtan, thanks for your answer. As I mentioned at the end of the post:
If I set hparams with the frozen model, then I get the correct label predictions. But if the frozen model needs an extra file with the label map, why do the online resources say that the frozen model .pb is enough to make predictions? Or am I maybe misunderstanding something?
I already set hparams. My question is why the frozen model saves the COCO labels and not my custom labels.
Hi there,
I have a question regarding the frozen_model.pb.
I fine-tuned a model from a pre-trained efficientdet-d0 checkpoint to make inferences on my own custom dataset. I exported the frozen model with tf2/
Then I get in my output folder:
assets efficientdet-d0_frozen.pb saved_model.pb variables
According to the tf2/README.md, there are several ways to make predictions: using a frozen graph or constructing the graph from scratch. My question is why the predictions with the frozen graph assign the labels from the COCO dataset and not my own custom labels.
result with the frozen model: [image]
result with the model built from scratch (without the frozen model): [image]
If I set hparams with the frozen model, then I get the correct label predictions. But if the frozen model needs an extra file with the label map, why do the online resources say that the frozen model .pb is enough to make predictions? Or am I maybe misunderstanding something? I would like to get some insight into this topic.