tensorflow / tfjs

A WebGL accelerated JavaScript library for training and deploying ML models.
https://js.tensorflow.org
Apache License 2.0

Unknown op 'NonMaxSuppressionV5' #2265

Closed alien35 closed 4 years ago

alien35 commented 4 years ago

To get help from the community, we encourage using Stack Overflow and the tensorflow.js tag.

TensorFlow.js version

I'm using

Browser version

Chrome 78

Describe the problem or feature request

I'm trying to run an object detection model created in tensorflow, and I am getting this error:

Unknown op 'NonMaxSuppressionV5'.

Code to reproduce the bug / link to feature request


      // Read the image, resize, and cast to float32.
      const image = tf.browser.fromPixels(MY_DOM_IMAGE);
      const smallImg = tf.image.resizeBilinear(image, [368, 432]);
      const resized = tf.cast(smallImg, 'float32');
      // Rebuild the tensor with a batch dimension, shape [1, 368, 432, 3].
      const t4d = tf.tensor4d(Array.from(resized.dataSync()), [1, 368, 432, 3]);
      let outputs = await model.executeAsync(
        { 'image_tensor': t4d },
        ['detection_boxes', 'detection_scores', 'detection_classes', 'num_detections',
         'detection_multiclass_scores', 'raw_detection_boxes', 'raw_detection_scores']
      );
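For reference, the error message itself hints at a workaround: a graph op that the runtime doesn't know can be stubbed with tf.registerOp(). Below is a minimal sketch of that idea. Everything in it is an illustrative assumption rather than the library's own fix: the helper name is made up, and it approximates NonMaxSuppressionV5 with plain hard NMS, ignoring V5's soft-NMS sigma input. On tfjs 1.5.1 and later the op is supported natively, so upgrading is the better option.

```javascript
// Illustrative fallback executor for the missing 'NonMaxSuppressionV5'
// graph op. The V5 op takes (boxes, scores, max_output_size,
// iou_threshold, score_threshold, soft_nms_sigma) and returns
// (selected_indices, selected_scores, valid_outputs); this sketch
// approximates it with plain hard NMS and simply ignores soft_nms_sigma.
function makeNmsV5Fallback(tf) {
  return async (node) => {
    const [boxes, scores, maxOutputSize, iouThreshold, scoreThreshold] =
      node.inputs;
    const indices = await tf.image.nonMaxSuppressionAsync(
      boxes, scores,
      maxOutputSize.dataSync()[0],
      iouThreshold.dataSync()[0],
      scoreThreshold.dataSync()[0]);
    // Mirror V5's three outputs.
    return [indices,
            tf.gather(scores, indices),
            tf.scalar(indices.size, 'int32')];
  };
}

// Wiring, assuming @tensorflow/tfjs is installed:
// const tf = require('@tensorflow/tfjs');
// tf.registerOp('NonMaxSuppressionV5', makeNmsV5Fallback(tf));
```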


wingman-jr-addon commented 4 years ago

@alien35 Can you please post how the model was converted, etc.? On the issue just referenced, #2254, I see NonMaxSuppression come across, but not NonMaxSuppressionV5, so some extra details would help (along with the model, or at least model.json); otherwise it's hard to reproduce.

wingman-jr-addon commented 4 years ago

@alien35 Can you please add enough details to reproduce the issue?

pyu10055 commented 4 years ago

@alien35 Can you also specify all library versions you are using, including TF and TFJS? Thanks.

shnamin commented 4 years ago

Hi, I also used MobileNetV3 Small, trained it with quantization, and then converted it to TensorFlow.js with the TensorFlow.js converter. When calling the model in tfjs, I get this error:

    Uncaught (in promise) TypeError: Unknown op 'NonMaxSuppressionV4'. File an issue at https://github.com/tensorflow/tfjs/issues so we can add it, or register a custom execution with tf.registerOp()
        at operation_executor.ts:95
        at mb (operation_executor.ts:52)
        at p (graph_executor.ts:362)
        at t.processStack (graph_executor.ts:348)
        at t. (graph_executor.ts:310)
        at callbacks.ts:253
        at Object.next (callbacks.ts:253)
        at o (callbacks.ts:253)

The object detection model is trained with quantization, and I converted it from the checkpoints with these commands:

    python export_inference_graph.py \
        --input_type image_tensor \
        --pipeline_config_path /pipeline.config \
        --trained_checkpoint_prefix /model.ckpt \
        --output_directory

    tensorflowjs_converter saved_model/ /tfjs/ \
        --quantization_bytes 1 \
        --input_format tf_saved_model \
        --output_format tfjs_graph_model \
        --skip_op_check

I am using TensorFlow 1.15.0.

federicolucca commented 4 years ago

Same issue for me, with the same steps to convert the model. Let me know.

McDo commented 4 years ago

Skip the op checks and replace the post-processing NMS:

tensorflowjs_converter \
    --input_format=tf_frozen_model \
    --output_format=tfjs_graph_model \
    --output_node_names='Postprocessor/ExpandDims_1,Postprocessor/Slice' \
    --skip_op_check \
    ./frozen_inference_graph.pb \
    ./web_model

Then I can use the CPU version of NMS in the tfjs code, something like:

tf.image.nonMaxSuppression(boxes2, maxScores, maxNumBoxes, 0.5, 0.5);

Tested against TensorFlow 1.15.0; works fine so far.
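For readers unfamiliar with what that call computes, here is a small dependency-free sketch of hard NMS with the same parameter order as tf.image.nonMaxSuppression (boxes, scores, maxOutputSize, iouThreshold, scoreThreshold). It is an illustration of the algorithm, not tfjs's implementation; boxes are assumed to be [y1, x1, y2, x2].

```javascript
// Intersection-over-union of two [y1, x1, y2, x2] boxes.
function iou(a, b) {
  const [ay1, ax1, ay2, ax2] = a;
  const [by1, bx1, by2, bx2] = b;
  const interH = Math.max(0, Math.min(ay2, by2) - Math.max(ay1, by1));
  const interW = Math.max(0, Math.min(ax2, bx2) - Math.max(ax1, bx1));
  const inter = interH * interW;
  const areaA = (ay2 - ay1) * (ax2 - ax1);
  const areaB = (by2 - by1) * (bx2 - bx1);
  return inter / (areaA + areaB - inter);
}

// Greedy hard NMS: walk boxes in descending score order, keeping a box
// only if it doesn't overlap an already-kept box beyond iouThreshold.
function nonMaxSuppression(boxes, scores, maxOutputSize, iouThreshold, scoreThreshold) {
  const order = scores
    .map((s, i) => [s, i])
    .filter(([s]) => s > scoreThreshold)
    .sort((a, b) => b[0] - a[0])
    .map(([, i]) => i);
  const selected = [];
  for (const i of order) {
    if (selected.length >= maxOutputSize) break;
    if (selected.every((j) => iou(boxes[i], boxes[j]) <= iouThreshold)) {
      selected.push(i);
    }
  }
  return selected;  // indices of the kept boxes
}
```

For example, two heavily overlapping boxes and one disjoint box yield the highest-scoring box of the overlapping pair plus the disjoint one.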

federicolucca commented 4 years ago

I tried:

tensorflow 1.15, tensorflowjs 1.4

I created a custom object detection model from "ssdlite_mobilenet_v2_coco_2018_05_09" and exported it with:

    python object_detection/export_inference_graph.py \
        --add_postprocessing_op true \
        --input_type image_tensor \
        --pipeline_config_path output/ai/annotations/ssdlite_mobilenet_v2_coco_2018_05_09.config \
        --trained_checkpoint_prefix output/ai/training/model.ckpt-$1 \
        --output_directory output/ai/exported-model

I tested it in Python and it works. After that I converted the model to tfjs with:

    tensorflowjs_converter \
        --input_format=tf_saved_model \
        --output_node_names='Postprocessor/ExpandDims_1,Postprocessor/Slice' \
        --saved_model_tags=serve \
        --skip_op_check \
        --output_format=tfjs_graph_model \
        output/ai/exported-model/saved_model \
        output/ai/exported-model-web

and I get the same issue:

UnhandledPromiseRejectionWarning: TypeError: Unknown op 'NonMaxSuppressionV5'.

The code that I use in Node.js is:

    const model = await cocossd.load({modelUrl: 'file://output/ai/exported-model-web/model.json'})
    const predictions = await model.classify(input)

*input is a tensor4d created from an image.

McDo commented 4 years ago

@federicolucca try converting from tf_frozen_model: https://github.com/tensorflow/tfjs-models/tree/master/coco-ssd#technical-details-for-advanced-users

lina128 commented 4 years ago

Hi @federicolucca and @alien35, sorry for the issue you are experiencing; it happens because the op 'NonMaxSuppressionV5' was not supported in previous versions. We just added support in 1.5.1, see details here. Please update your tensorflowjs version to 1.5.1 and try again. This should solve the problem.

federicolucca commented 4 years ago

@lina128 Don't worry, I found it and tried to resolve it myself with a custom version of 1.4.0. @McDo I will try the frozen model with 1.5.1 ASAP. Thanks!

Agiledom commented 4 years ago

@lina128 I'm hitting @shnamin 's exact issue / error with NonMaxSuppressionV4.

The model is converted via the following on tfjs 1.7.4 (I have also tried converting the saved model):

tensorflowjs_converter --input_format=tf_frozen_model \
--output_node_names='detection_boxes,detection_classes,detection_multiclass_scores,detection_scores,num_detections,raw_detection_boxes,raw_detection_scores' \
--saved_model_tags=serve \
--skip_op_check \
--output_format=tfjs_graph_model \
gs://path/to/input/frozen_inference_graph.pb \
gs://path/to/output/tfjs_model

On the off chance this isn't a conversion issue and is instead inference-related: I'm using the React Native API and the camera-with-tensors module to run inference.

Any ideas?

lina128 commented 4 years ago

Hi @Agiledom, we can add V4 support. You can track the work here: https://github.com/tensorflow/tfjs/issues/2450. We can target next week's release.

Agiledom commented 4 years ago

Hey @lina128, that would be awesome! Thank you so much! I will track #2450.