Can you try reshaping like this before you call predict? tensor = tensor.reshape([1, 224, 224, 3]);
Yes, I tried that too; it didn't help.
Also, tfjs-node v3.8.0 seems to have broken loadSavedModel() on macOS. It looks like a regression bug; I have posted about it here: https://github.com/tensorflow/tensorflow/issues/38260/#issuecomment-881928290
In order to expedite the troubleshooting process, please provide a CodePen example or sample code to reproduce the issue. Thanks!
Basically this is what I did:
const model = await tfnode.loadGraphModel(modelUrl);
const image = fs.readFileSync(imageFile);
let decodedImage = tfnode.node.decodeImage(image);
I tried both predict and executeAsync.
I can share the converted model.json if that will help.
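For clarity, here is a minimal, self-contained sketch of that flow in tfjs-node (the model URL and image path are placeholders, and the expandDims line is the batching suggestion from above):

```js
const tfnode = require('@tensorflow/tfjs-node');
const fs = require('fs');

async function run() {
  // Placeholder URL; in my case this points at the converted model.json.
  const model = await tfnode.loadGraphModel('https://example.com/model/model.json');

  const image = fs.readFileSync('./image.jpg');        // placeholder image path
  let decodedImage = tfnode.node.decodeImage(image);   // rank-3 [height, width, channels] for JPG/PNG
  decodedImage = decodedImage.expandDims(0);           // add a batch dim -> [1, height, width, channels]

  // Tried both of these:
  // const result = model.predict(decodedImage);
  const result = await model.executeAsync(decodedImage);
  console.log(result);
}

run().catch(console.error);
```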
Sure, if possible a working CodePen example or a Git repo would help. Thank you
Here is the converted model.json https://github.com/playground/tfjs-object-detection/tree/main/model-json-service/model
@rthadur v3.8.0 seems to have broken loadSavedModel on Mac; model.predict() is throwing this error:
Error: Session fail to run with error: Cannot parse tensor from proto: dtype: DT_VARIANT
tensor_shape {
}
variant_val {
type_name: "tensorflow::TensorList"
metadata: "\001\000\001\377\377\377\377\377\377\377\377\377\001\030\001"
}
[[{{node StatefulPartitionedCall/StatefulPartitionedCall/map/TensorArrayV2_1/_9__cf__9}}]]
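For reference, this is roughly how the SavedModel path is being exercised, a minimal sketch in which the SavedModel directory and image path are placeholders:

```js
const tfnode = require('@tensorflow/tfjs-node');
const fs = require('fs');

async function runSavedModel() {
  // Placeholder path to the unconverted TF SavedModel directory.
  const model = await tfnode.node.loadSavedModel('./saved_model');

  const image = fs.readFileSync('./image.jpg');                 // placeholder image
  const input = tfnode.node.decodeImage(image).expandDims(0);   // [1, h, w, 3]

  // This predict() call is where the DT_VARIANT error above is thrown
  // under tfjs-node v3.8.0 on macOS.
  const output = model.predict(input);
  console.log(output);
}

runSavedModel().catch(console.error);
```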
This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. Thank you.
Any update?
Any update?
I'm hitting the same issue, but the difference is that when I use tf.browser.fromPixels(imgPath), I get a tensor of rank 3 (it's a JPG file), not 4 (as the doc says it should be). The image path is an image URL stored in the cloud. Likewise, when I use model.execute or model.predict, I get this error:
Uncaught (in promise) Error: The shape of dict['input_tensor'] provided in model.execute(dict) must be [1,-1,-1,3], but was [711,474,3]
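In case it helps, the usual workaround for that shape mismatch is to add a batch dimension before calling the model. A minimal sketch, assuming tf is the global from the tfjs script tag, model is the already-loaded GraphModel, and the image element id is hypothetical:

```js
async function detect(model) {
  // tf.browser.fromPixels returns a rank-3 [height, width, 3] tensor,
  // so a batch dimension has to be added before calling the model.
  const imgElement = document.getElementById('my-image');   // hypothetical element id
  const pixels = tf.browser.fromPixels(imgElement);         // e.g. shape [711, 474, 3]
  const batched = pixels.expandDims(0);                      // shape [1, 711, 474, 3]

  // 'input_tensor' is the input name taken from the error message above.
  return model.executeAsync({ input_tensor: batched });
}
```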
Hi, @playground
Apologies for the delayed response. We're revisiting our older issues and checking whether they have been resolved, so may I know whether you are still looking for a solution, or has your issue been resolved?
If the issue still persists after trying the latest version of TFJS, please let us know, along with an error log and a code snippet, so we can replicate the issue from our end.
Could you please confirm whether this issue is resolved for you? Please feel free to close the issue if it is. Thank you!
This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. Thank you.
Yes, this issue still exists; please keep it open until it's resolved.
Hi, @playground
Apologies for the delayed response. I was trying to replicate this issue from my end and I'm getting the error below, so could you please help me with the steps needed to reproduce it?
I tried your code snippet from here and also used the same model from your GitHub repo, which you converted with the TFJS converter, so maybe I'm doing something wrong while trying to reproduce the issue from my end. Thank you!
gaikwadrahul-macbookpro:TFJS gaikwadrahul$ node index.js
/Users/gaikwadrahul/Desktop/TFJS/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:25608
throw new Error(message);
^
Error: Failed to parse model JSON of response from https://github.com/playground/tfjs-object-detection/blob/main/model-json-service/model/model.json. Please make sure the server is serving valid JSON for this request.
at HTTPRequest.<anonymous> (/Users/gaikwadrahul/Desktop/TFJS/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:25608:31)
at step (/Users/gaikwadrahul/Desktop/TFJS/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:138:27)
at Object.throw (/Users/gaikwadrahul/Desktop/TFJS/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:87:53)
at rejected (/Users/gaikwadrahul/Desktop/TFJS/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:74:36)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
Node.js v18.15.0
gaikwadrahul-macbookpro:TFJS gaikwadrahul$
@gaikwadrahul8 I think you need to try the raw link of the model https://raw.githubusercontent.com/playground/tfjs-object-detection/main/model-json-service/model/model.json
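Something along these lines should load it (the github.com blob URL returns an HTML page, which is why the model JSON parsing fails, while the raw URL serves plain JSON):

```js
const tfnode = require('@tensorflow/tfjs-node');

const modelUrl =
  'https://raw.githubusercontent.com/playground/tfjs-object-detection/main/model-json-service/model/model.json';

// loadGraphModel also fetches the weight shard files referenced by model.json,
// so those need to be reachable relative to this URL as well.
tfnode.loadGraphModel(modelUrl)
  .then((model) => console.log('model loaded, inputs:', model.inputs))
  .catch((err) => console.error(err));
```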
This issue has been marked stale because it has had no recent activity for 7 days. It will be closed if no further activity occurs. Thank you.
This issue was closed due to lack of activity after being marked stale for the past 7 days.
Hi, @playground
I'm really sorry, this issue got closed by the google-ml-butler bot, so I have re-opened it. We'll update you about this issue soon. Thank you!
Hi, @playground
Apologies for the delayed response. I tried to replicate this issue from my end and I'm getting a similar error message, so we'll have to dig more into it and will update you soon. Thank you for reporting this issue; I really appreciate your efforts and time.
CC: @mattsoulanille
Here is the error log output for your reference; I tried with .predict(), .execute(), and .executeAsync():
gaikwadrahul-macbookpro:test-5366 gaikwadrahul$ node index.js
/Users/gaikwadrahul/Desktop/TFJS/test-5366/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:454
throw new Error(typeof msg === 'string' ? msg : msg());
^
Error: The shape of dict['input_tensor'] provided in model.execute(dict) must be [1,-1,-1,3], but was [400,600,3]
at Object.assert (/Users/gaikwadrahul/Desktop/TFJS/test-5366/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:454:15)
at /Users/gaikwadrahul/Desktop/TFJS/test-5366/node_modules/@tensorflow/tfjs-converter/dist/tf-converter.node.js:31349:26
at Array.forEach (<anonymous>)
at GraphExecutor.checkInputShapeAndType (/Users/gaikwadrahul/Desktop/TFJS/test-5366/node_modules/@tensorflow/tfjs-converter/dist/tf-converter.node.js:31341:29)
at GraphExecutor.execute (/Users/gaikwadrahul/Desktop/TFJS/test-5366/node_modules/@tensorflow/tfjs-converter/dist/tf-converter.node.js:30855:14)
at GraphModel.execute (/Users/gaikwadrahul/Desktop/TFJS/test-5366/node_modules/@tensorflow/tfjs-converter/dist/tf-converter.node.js:31913:36)
at GraphModel.predict (/Users/gaikwadrahul/Desktop/TFJS/test-5366/node_modules/@tensorflow/tfjs-converter/dist/tf-converter.node.js:31758:34)
at file:///Users/gaikwadrahul/Desktop/TFJS/test-5366/index.js:9:33
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
Node.js v18.15.0
gaikwadrahul-macbookpro:test-5366 gaikwadrahul$
Thanks @gaikwadrahul8, I look forward to your update. I also have this issue that has been sitting out there for some time: https://github.com/tensorflow/tfjs/issues/7398#issuecomment-1503170942. Can you please provide an update there too?
Please make sure that this is a bug. As per our GitHub Policy, we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub.
System information:
- macOS Big Sur
- Node v15.7.0
- NPM 7.20.0

I'm running @tensorflow/tfjs-node v3.7.0.
I attempted to convert my SavedModel to a TFJS-friendly model.json using tensorflowjs_converter.
When I run model.predict(), it throws these errors:
I have also tried model.executeAsync() instead of model.predict(). NOTE: when used with the SavedModel directly (without the conversion), it works for both image types.
for jpg image
for png image
If I add --control_flow_v2=True to the conversion, it then fails in loadGraphModel.
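For reference, the conversion was done with a command along these lines (input and output directory paths are placeholders):

```bash
tensorflowjs_converter \
    --input_format=tf_saved_model \
    --output_format=tfjs_graph_model \
    ./saved_model ./web_model

# Adding --control_flow_v2=True to the command above produces the variant
# that then fails to load with loadGraphModel.
```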