tensorflow / models

Models and examples built with TensorFlow

How to convert mobilenet v2 to pb? #7919

Open Semihal opened 4 years ago

Semihal commented 4 years ago

System information

Describe the problem

I am trying to convert ssd_mobilenet_v2_coco to UFF and then to a CudaEngine. The original frozen_inference_graph.pb converts perfectly, but the frozen_inference_graph.pb produced by object_detection/export_inference_graph.py does not. If you compare the two models by hash sums, they are different. How do I properly export the checkpoint (model.ckpt) so that I get the original frozen_inference_graph.pb?

I run the conversion like this:

export PYTHONPATH=$(realpath ../tensorflow-models/research):$(realpath ../tensorflow-models/research/slim)
INPUT_TYPE=image_tensor
PIPELINE_CONFIG_PATH=$(realpath ssd_mobilenet_v2_13/pipeline.config)
TRAINED_CKPT_PREFIX=$(realpath ssd_mobilenet_v2_13/model.ckpt)
EXPORT_DIR=$(realpath ssd_mobilenet_v2_13/export/)
python3 -m object_detection.export_inference_graph \
    --input_type=${INPUT_TYPE} \
    --pipeline_config_path=${PIPELINE_CONFIG_PATH} \
    --trained_checkpoint_prefix=${TRAINED_CKPT_PREFIX} \
    --output_directory=${EXPORT_DIR}
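
The hash-sum comparison mentioned above can be sketched with Python's stdlib hashlib; the file paths in the comments are assumptions based on the directory layout shown in the export command, not paths confirmed by the thread:

```python
import hashlib

def sha256_of(path, chunk_size=8192):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage: compare the downloaded graph against the re-exported one.
# original = sha256_of("ssd_mobilenet_v2_13/frozen_inference_graph.pb")
# exported = sha256_of("ssd_mobilenet_v2_13/export/frozen_inference_graph.pb")
# print(original == exported)
```

Note that differing hashes alone do not prove the graphs are functionally different; re-exporting can change node ordering or metadata without changing behavior, so a graph-level diff is needed to locate the actual divergence.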
tensorflowbutler commented 4 years ago

Thank you for your post. We noticed you have not filled out the following fields in the issue template. Could you update them if they are relevant in your case, or leave them as N/A? Thanks.

TensorFlow version
Bazel version

Semihal commented 4 years ago

@tensorflowbutler, I filled out these two fields.