victordibia / handtrack.js

A library for prototyping realtime hand detection (bounding box), directly in the browser.
https://victordibia.com/handtrack.js/

How to remove the post-processing part of the object detection model graph during conversion #21

sushanth-d closed this issue 4 years ago

sushanth-d commented 4 years ago

Hey. I am facing the following issue when converting a frozen graph model to the TensorFlow.js format.

In your (awesome) blog post, you mentioned that:

I followed the suggestion by authors of the Tensorflow coco-ssd example [2] in removing the post-processing part of the object detection model graph during conversion.

So I went to that link and executed the command in the README file:

tensorflowjs_converter --input_format=tf_frozen_model \
                       --output_format=tfjs_graph_model \
                       --output_node_names='Postprocessor/ExpandDims_1,Postprocessor/Slice' \
                       ./frozen_inference_graph.pb \
                       ./web_model

But with my Faster R-CNN Inception V2 model, I am getting this error:

KeyError: "The name 'Postprocessor/Slice' refers to an Operation not in the graph."
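
Side note: the Postprocessor/* names in the coco-ssd README come from an SSD export, and this KeyError suggests the Faster R-CNN graph names its post-processing nodes differently. A minimal sketch for listing candidate output nodes, assuming TF 1.x and the frozen_inference_graph.pb path from the command above:

# Sketch: list the post-processing / detection nodes that actually exist in a
# frozen graph, to pick --output_node_names that are valid for this export.
import tensorflow as tf  # assumes TF 1.x, e.g. 1.14 or 1.15

graph_def = tf.compat.v1.GraphDef()
with open("frozen_inference_graph.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

for node in graph_def.node:
    if "Postprocessor" in node.name or node.name.startswith("detection"):
        print(node.op, node.name)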

And when I try to convert SSD_MobileNet_V2, I get this error:

ValueError: Unsupported Ops in the model before optimization
NonMaxSuppressionV5
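
Side note: NonMaxSuppressionV5 can show up in graphs exported with TF 1.15, and the converter at tensorflowjs 1.3.2 reports it as unsupported, which lines up with the fix in the next comment. A minimal sketch to check which NMS variant a given frozen graph contains, under the same path and TF 1.x assumptions as above:

# Sketch: report which NonMaxSuppression variant the exported graph contains.
import tensorflow as tf  # assumes TF 1.x

graph_def = tf.compat.v1.GraphDef()
with open("frozen_inference_graph.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

print(sorted({node.op for node in graph_def.node
              if node.op.startswith("NonMaxSuppression")}))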

tensorflowjs version: 1.3.2, tensorflow version: 1.15.0

Any help would be much, much appreciated. Thank you.

sushanth-d commented 4 years ago

Downgrading TensorFlow from 1.15.0 to 1.14.0 when training the model fixed this issue. After that, I was able to convert the model to the tfjs format without any errors.
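
For anyone reproducing this, a minimal sketch of a matching environment, using the versions mentioned in this thread (treat it as a starting point rather than a guaranteed combination):

pip install "tensorflow==1.14.0" "tensorflowjs==1.3.2"
# then re-export frozen_inference_graph.pb with this environment and rerun
# the tensorflowjs_converter command shown above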