theprodev closed this issue 6 years ago
Tensorflow publishes some pretrained object detection models, which you can find here:
In order to run these with GraphPipe, download the linked .tar.gz file, uncompress it, and then invoke the model server with a directory mount that allows graphpipe-tf to access the frozen_inference_graph.pb file. For example:
cd /tmp/
wget http://download.tensorflow.org/models/object_detection/ssd_mobilenet_v1_coco_2018_01_28.tar.gz
tar xvfz ssd_mobilenet_v1_coco_2018_01_28.tar.gz
docker run -it --rm \
-v "$PWD:/models/" \
-p 9000:9000 \
sleepsonthefloor/graphpipe-tf:cpu \
--model=/models/ssd_mobilenet_v1_coco_2018_01_28/frozen_inference_graph.pb \
--listen=0.0.0.0:9000
Now grab a sample image:
wget https://github.com/simo23/tinyYOLOv2/raw/master/dog.jpg
You can call the model with the following example code:
import numpy as np
from PIL import Image
from graphpipe import remote

# Load the image and add a leading batch dimension: (H, W, 3) -> (1, H, W, 3)
data = np.array(Image.open("dog.jpg"))
data = data.reshape([1] + list(data.shape))

pred = remote.execute_multi("http://127.0.0.1:9000", [data], ['image_tensor'],
                            ['detection_boxes', 'detection_scores',
                             'num_detections', 'detection_classes'])
print("Class predictions: ", pred[-1])
The code above requests four outputs from the model, the last of which is detection_classes; that is what gets printed. Note the use of remote.execute_multi, which is a more explicit version of remote.execute. This is the output I got:
Class predictions: [[ 18. 3. 2. 8. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.
1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.
1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.
1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.
1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.
1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.
1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.
1. 1.]]
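Most of those trailing 1s are low-confidence padding; in practice you would filter the detections by their detection_scores before using them. A minimal sketch of that post-processing, using small made-up stand-in arrays with the same shapes the model returns (boxes, scores, and classes would really come from pred):

```python
import numpy as np

# Stand-in values shaped like the model outputs: (1, N, 4) boxes,
# (1, N) scores, (1, N) classes. Real values come from pred[0..3].
boxes = np.array([[[0.1, 0.1, 0.9, 0.9], [0.2, 0.2, 0.4, 0.4]]])
scores = np.array([[0.95, 0.30]])
classes = np.array([[18.0, 1.0]])

# Keep only detections above a confidence threshold.
keep = scores[0] > 0.5
for cls, score, box in zip(classes[0][keep], scores[0][keep], boxes[0][keep]):
    print(int(cls), round(float(score), 2), box)
```

With the sample values above, only the class-18 detection survives the 0.5 threshold.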
Class 18 is dog, 2 is bicycle, 3 is car, and 8 is truck (https://github.com/nightrome/cocostuff/blob/master/labels.md).
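To turn the numeric IDs into names in code, you can keep a lookup table. This is just a partial, hand-written mapping covering the IDs mentioned above (the full list is in the cocostuff labels.md linked above):

```python
# Partial COCO class-ID -> label mapping; extend as needed from labels.md.
coco_labels = {1: "person", 2: "bicycle", 3: "car", 8: "truck", 18: "dog"}

# The first prediction for dog.jpg was class 18.0:
print(coco_labels[int(18.0)])  # dog
```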
Hope that helps!
Thanks a lot for the detailed reply. I will check it again soon.
Hi, I see there are examples for using SqueezeNet, but I don't see URLs for other models. I want to use COCO or YOLO models; is that possible?
Thanks in advance.