Closed un1crom closed 4 years ago
This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. Thank you.
Closing as stale. Please @mention us if this needs more attention.
I got the same error.
My scenario: I fine-tuned a pre-trained model from the TensorFlow Model Zoo via transfer learning with the TensorFlow API, exported it as a SavedModel (model.pb), and converted it to TFJS format (model.json and sharded .bin files).
When I tried running this model.json in JavaScript (web), it gave the error below:
Uncaught (in promise) Error: The dtype of dict['input_tensor'] provided in model.execute(dict) must be int32, but was float32
When I tried someone else's working converted model (model.json and sharded .bin files) in my JavaScript (web) code, it worked.
Conclusion: something is wrong with my converted model. I converted it using tensorflowjs_converter, and the original model (model.pb) works accurately in Python.
I'm still trying to convert my model.pb with different tensorflowjs_converter versions, as it seems to be a converter versioning issue.
Same problem for me when trying to run the TensorFlow SavedModel Import Demo with a custom SSD MobileNet v2 model.
@parthlathiya42 it worked for me with the following cast:
const integerTensor = imageToPredict.toInt();
return this.model.executeAsync(integerTensor); //execute() doesn't work for me here
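The cast above can be made conditional, so the same prediction path works whether the converted model declares a float32 or an int32 input. This is a minimal sketch: the dtype-comparison helper is plain JavaScript and shown standalone; `model.inputs[0].dtype` and `tensor.toInt()` in the commented usage are assumed to be available on a loaded tfjs GraphModel and Tensor.

```javascript
// Decide whether the input tensor needs an int cast before executeAsync.
// Object-detection models converted from the TF model zoo often declare an
// integer input, while preprocessed image tensors arrive as float32.
function needsIntCast(declaredDtype, tensorDtype) {
  return declaredDtype === 'int32' && tensorDtype !== 'int32';
}

// Illustrative usage inside the prediction path (requires tfjs):
//   const input = needsIntCast(model.inputs[0].dtype, imageToPredict.dtype)
//     ? imageToPredict.toInt()
//     : imageToPredict;
//   return model.executeAsync(input);

console.log(needsIntCast('int32', 'float32')); // true  -> cast before executing
console.log(needsIntCast('float32', 'float32')); // false -> pass through as-is
```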
System information
Used a stock example script provided in TensorFlow.js;
TensorFlow.js installed from (npm or script link):
!pip install tensorflowjs
TensorFlow.js version 4.0.0;
Google Chrome Version 107.0.5304.87 (Official Build) (64-bit);
Describe the current behavior
After the conversion, the input data type changed from float32 to int32.
Describe the expected behavior
The data type remains the same.
Standalone code to reproduce the issue
!tensorflowjs_converter \
--input_format=tf_saved_model \
--output_format=tfjs_graph_model \
--signature_name=serving_default \
--saved_model_tags=serve \
/content/gdrive/MyDrive/customTF2/data/inference_graph/saved_model \
Other info / logs
TFmodel.signatures['serving_default'].output_dtypes
{'detection_anchor_indices': tf.float32, 'raw_detection_scores': tf.float32, 'detection_classes': tf.float32,
'num_detections': tf.float32, 'raw_detection_boxes': tf.float32, 'detection_boxes': tf.float32,
'detection_multiclass_scores': tf.float32, 'detection_scores': tf.float32}
(Note: these are the signature's output dtypes, which are all float32 either way; the error above concerns the input dtype, which output_dtypes does not show.)
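You can also read the declared input dtype straight out of the converted model.json, without loading tfjs or Python. This sketch embeds a tiny object mimicking the `signature` block that tensorflowjs_converter writes (field names follow the serialized TensorFlow SignatureDef; the tensor name `input_tensor` and the `DT_UINT8` value are illustrative, not taken from the poster's actual file):

```javascript
// Stand-in for the parsed model.json of a converted graph model.
const modelJson = {
  signature: {
    inputs: {
      input_tensor: { name: 'input_tensor:0', dtype: 'DT_UINT8' }
    }
  }
};

// Map each declared input name to its dtype string.
function signatureInputDtypes(json) {
  return Object.fromEntries(
    Object.entries(json.signature.inputs).map(([key, info]) => [key, info.dtype])
  );
}

console.log(signatureInputDtypes(modelJson)); // { input_tensor: 'DT_UINT8' }
```

An integer dtype here (e.g. `DT_UINT8` or `DT_INT32`) explains why tfjs rejects a float32 feed at execute time.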
Colab link
Has there been a fix or workaround for this issue? I created a model using transfer learning from the Model Zoo Mask R-CNN model.
TensorFlow.js version
"@tensorflow/tfjs-node": "^2.0.1" "@tensorflow/tfjs": "^2.0.1"
Browser version
Node.js and Express.js, no browser
Describe the problem or feature request
When using a LOCALLY downloaded and served model, the index.js file in the NPM/node module requires me to cast the tensor to int32, or it errors out about float32 no matter what I pass it.
Code to reproduce the bug / link to feature request
model loading
info on the tensor
the error out
I manually changed the offending code to... and everything runs.
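Since the offending code itself was elided above, here is a hedged sketch of the kind of guard that surfaces this mismatch with a clearer message before `executeAsync` is called. The dtype check is plain JavaScript; `declared` is assumed to come from a loaded GraphModel's `inputs` list, and plain objects with a `dtype` field stand in for tensors so the logic runs standalone:

```javascript
// Validate a feed dict against the dtypes the model declares, throwing a
// readable error instead of failing deep inside execute/executeAsync.
function checkFeedDtypes(declared, feed) {
  for (const { name, dtype } of declared) {
    const t = feed[name];
    if (t && t.dtype !== dtype) {
      throw new Error(
        `dtype of dict['${name}'] must be ${dtype}, but was ${t.dtype}`
      );
    }
  }
}

// Example with plain objects standing in for tensors:
try {
  checkFeedDtypes(
    [{ name: 'input_tensor', dtype: 'int32' }],
    { input_tensor: { dtype: 'float32' } }
  );
} catch (e) {
  console.log(e.message); // dtype of dict['input_tensor'] must be int32, but was float32
}
```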