Open ily-R opened 1 year ago
The code to reproduce this issue does not make sense. Besides the import statements, there is only one line of code which is not commented out:
```python
mlmodel = ct.convert("efficientdet_lite2_detection_1")
```
You can't pass a string literal as a model to convert.
@ily-R - please share complete code to reproduce this issue.
@TobyRoseman I am sorry if the commented lines added confusion. To reproduce the issue you need to:

1. Download the EfficientDet Lite2 model from TensorFlow Hub and extract it; you will get an `efficientdet_lite2_detection_1` directory with `saved_model.pb` and a `variables` dir inside.
2. Run:

```python
import coremltools as ct

mlmodel = ct.convert("efficientdet_lite2_detection_1")
```
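Independent of the converter, it can help to first sanity-check that the extracted directory really is a TF SavedModel before calling `convert()`. A minimal sketch (the `looks_like_saved_model` helper is hypothetical, not part of any library):

```python
import os

def looks_like_saved_model(path):
    # A TF SavedModel directory contains a saved_model.pb file
    # and a variables/ subdirectory. (Hypothetical helper for sanity-checking.)
    return (os.path.isfile(os.path.join(path, "saved_model.pb"))
            and os.path.isdir(os.path.join(path, "variables")))

print(looks_like_saved_model("efficientdet_lite2_detection_1"))
```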
Now, the commented lines I shared just confirm that the issue is not with the downloaded model itself; they show that we can get predictions by loading the TF model directly:
```python
import tensorflow as tf
import numpy as np

imported = tf.saved_model.load("efficientdet_lite2_detection_1")
model = imported.signatures["serving_default"]
image = np.ones((1, 448, 448, 3), dtype=np.uint8)
output = model(images=image)
```
The output will be a dict with 4 tensors: bounding boxes, scores, classes, and number of detections. They have the following shapes and dtypes, respectively:

- `output_0`: shape=(1, 100, 4), dtype=float32
- `output_1`: shape=(1, 100), dtype=float32
- `output_2`: shape=(1, 100), dtype=float32
- `output_3`: shape=(1,), dtype=int32
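To make the dtype suspicion concrete, here is a NumPy-only sketch with dummy tensors mirroring the shapes/dtypes above, plus the kind of cast to float32 that would remove the lone int32 output (the data is purely illustrative, and this is not a confirmed workaround for the converter crash):

```python
import numpy as np

# Dummy tensors mirroring the reported output shapes/dtypes
output_0 = np.zeros((1, 100, 4), dtype=np.float32)  # bounding boxes
output_1 = np.zeros((1, 100), dtype=np.float32)     # scores
output_2 = np.zeros((1, 100), dtype=np.float32)     # classes
output_3 = np.array([100], dtype=np.int32)          # number of detections

# Casting the int32 count to float32 would give all outputs a uniform dtype
output_3_f32 = output_3.astype(np.float32)
print(output_3_f32.dtype)  # float32
```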
Now I'm guessing that the problem comes from the **int32** dtype?
@TobyRoseman Any feedback on this?
🐞Describing the bug
Hello,
I tried to convert the EfficientDet Lite2 model found on TensorFlow Hub here, using the saved_model directory. So I used a simple coremltools `convert()` call, but it crashes while "Running TensorFlow Graph Passes".

Stack Trace
To Reproduce
System environment (please complete the following information):