Script to convert the Matterport Mask_RCNN Keras model to a TensorFlow frozen graph and a TensorFlow Serving model, plus inference over gRPC or the REST API using TensorFlow Model Server.
```shell
python3 main.py
```
If you use a different config class, replace the existing config load in `main.py`:

```python
# main.py
# Current config load
config = get_config()
# Replace it with your config class
config = your_custom_config_class
```
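As a rough sketch of what such a config class can look like (the class name and all values below are hypothetical; with Matterport's package installed you would instead subclass `mrcnn.config.Config`, which exposes the same attribute names):

```python
# Hypothetical custom config; mirrors the attribute names Matterport's
# mrcnn.config.Config exposes, with illustrative values.
class YourCustomConfig:
    NAME = "your_dataset"        # identifies this configuration
    NUM_CLASSES = 1 + 3          # background + your object classes
    GPU_COUNT = 1
    IMAGES_PER_GPU = 1
    # mrcnn derives the effective batch size the same way:
    BATCH_SIZE = GPU_COUNT * IMAGES_PER_GPU

config = YourCustomConfig()
```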
Once the conversion above finishes, you will have a `saved_model.pb`. Serve it with TensorFlow Model Server:

```shell
tensorflow_model_server --port=8500 --rest_api_port=8501 --model_name=mask --model_base_path=/path/to/saved_model/
```
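Before sending requests, you can verify the model loaded via TensorFlow Serving's REST model-status endpoint (`GET /v1/models/<model_name>`). A minimal sketch, assuming the host/port and `mask` model name from the serving command above:

```python
import json
import urllib.request

# Host and REST port from the tensorflow_model_server command above.
BASE_URL = "http://localhost:8501"

def model_status_url(model_name):
    """Build the URL of TF Serving's model-status endpoint."""
    return f"{BASE_URL}/v1/models/{model_name}"

def is_available(model_name):
    """True once TF Serving reports the model version state as AVAILABLE."""
    with urllib.request.urlopen(model_status_url(model_name)) as resp:
        status = json.load(resp)
    return status["model_version_status"][0]["state"] == "AVAILABLE"
```

For example, `is_available("mask")` should return `True` once the server has loaded the saved_model.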
Adjust `inferencing/saved_model_config.py` if needed; no change is required if the saved_model is the default COCO model. Then run `inferencing/saved_model_inference.py` with the image path:
```shell
# Set the Python path to the repo root
export PYTHONPATH=$PYTHONPATH:$(pwd)
# Run inference with gRPC
python3 inferencing/saved_model_inference.py -t grpc -p test_image/monalisa.jpg
# Run inference with the REST API
python3 inferencing/saved_model_inference.py -t restapi -p test_image/monalisa.jpg
```
Thanks to @rahulgullan for the REST API client code.