onnx / onnx-tensorflow

Tensorflow Backend for ONNX

converted tf model inference issue #1033

Open FengMu1995 opened 2 years ago

FengMu1995 commented 2 years ago

Describe the bug

I converted my ONNX model to a TensorFlow SavedModel with onnx-tensorflow.

When I test it with onnx_tf, it runs successfully:

```python
import cv2
import numpy as np
import onnx
from PIL import Image
from onnx_tf.backend import prepare

onnx_model = onnx.load("./rvm_144*256.onnx")  # load onnx model
tf_rep = prepare(onnx_model)                  # prepare tf representation
tf_rep.export_graph("model.tf")               # export the model

# Recurrent states; must match the dtype of the model.
rec = [np.zeros([1, 1, 1, 1], dtype=np.float32)] * 4
rec[0] = np.zeros([1, 16, 72, 128], dtype=np.float32)
rec[1] = np.zeros([1, 20, 36, 64], dtype=np.float32)
rec[2] = np.zeros([1, 40, 18, 32], dtype=np.float32)
rec[3] = np.zeros([1, 64, 9, 16], dtype=np.float32)

img = cv2.imread('/home/wudi/project/Pytorch_Retinaface_1/meeting.png')
img = np.transpose(img, (2, 0, 1)) / 255.
src = np.expand_dims(img, 0)
test = src.astype(np.float32)

out = tf_rep.run([test, rec[0], rec[1], rec[2], rec[3]])
pha = out[1] * 255
pha = np.squeeze(pha)
im = Image.fromarray(np.uint8(pha))
im.save("out" + ".jpg")
print(out)
```

However, when I test it with TensorFlow directly, I get the error: `ValueError: callback pyfunc_0 is not found`.

```python
import cv2
import numpy as np
import torch
import tensorflow as tf

img = cv2.imread('./meeting.png')
img = np.transpose(img, (2, 0, 1)) / 255.
img = torch.from_numpy(img).unsqueeze(0).float()
input = tf.constant(img.numpy())
rec = [tf.constant(0.)] * 4

saved_model = tf.saved_model.load("model.tf")
rvm = saved_model.signatures["serving_default"]
outputs = rvm(r1i=rec[0], r2i=rec[1], r3i=rec[2], r4i=rec[3], src=input)
```
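For comparison, here is a minimal sketch of the same SavedModel call that feeds zero tensors with the recurrent-state shapes used in the onnx_tf test above. The input shape and the zero-filled tensors are assumptions based on that script and the model name, not values verified against the exported model:

```python
import tensorflow as tf

# Zero-initialized recurrent states, mirroring the shapes used in the
# onnx_tf test above (assumed to match what the SavedModel expects).
rec = [tf.zeros([1, 16, 72, 128], dtype=tf.float32),
       tf.zeros([1, 20, 36, 64], dtype=tf.float32),
       tf.zeros([1, 40, 18, 32], dtype=tf.float32),
       tf.zeros([1, 64, 9, 16], dtype=tf.float32)]

# Placeholder input frame; shape assumed from the model name (rvm_144*256),
# laid out as NCHW [batch, channels, height, width].
src = tf.zeros([1, 3, 144, 256], dtype=tf.float32)

saved_model = tf.saved_model.load("model.tf")
rvm = saved_model.signatures["serving_default"]
outputs = rvm(src=src, r1i=rec[0], r2i=rec[1], r3i=rec[2], r4i=rec[3])
```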

I have uploaded my models (model-onnx.zip / tf.zip) here.

model-onnx.zip