linxi-1214 opened 3 years ago
It worked well with the kangaroo-detector, but not with web_model. The tree of my web_model directory looks like:
-rw-r--r-- 1 qing staff 4194304 3 26 16:19 group1-shard1of5.bin
-rw-r--r-- 1 qing staff 4194304 3 26 16:19 group1-shard2of5.bin
-rw-r--r-- 1 qing staff 4194304 3 26 16:19 group1-shard3of5.bin
-rw-r--r-- 1 qing staff 4194304 3 26 16:19 group1-shard4of5.bin
-rw-r--r-- 1 qing staff 1647884 3 26 16:19 group1-shard5of5.bin
-rw-r--r-- 1 qing staff 376285 3 26 16:19 model.json
The size of group1-shard5of5.bin is not the same as in the kangaroo-detector. Could that be the reason?
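As an aside, the converter typically splits weights into fixed-size shards (the 4194304-byte files above are 4 MiB each), so the last shard is just the remainder and a different size there is expected. A quick sanity check is to compare the shard bytes on disk against the weights declared in model.json. This is only a sketch, assuming the tfjs graph-model format where `weightsManifest[i].weights` is an array of `{name, shape, dtype}` specs; the manifest below and the byte sizes per dtype are illustrative assumptions.

```javascript
// Sum the bytes the weights in a model.json manifest should occupy.
// Assumes the graph-model format: each manifest group has a `weights`
// array of {name, shape, dtype} specs. Dtype byte sizes are a
// simplified assumption (float32/int32 = 4 bytes).
const BYTES_PER_ELEMENT = { float32: 4, int32: 4, bool: 1, uint8: 1 };

function expectedWeightBytes(manifest) {
  let total = 0;
  for (const group of manifest) {
    for (const w of group.weights) {
      const elements = w.shape.reduce((a, b) => a * b, 1);
      total += elements * (BYTES_PER_ELEMENT[w.dtype] || 4);
    }
  }
  return total;
}

// Made-up manifest for illustration: one conv kernel plus its bias.
const manifest = [{
  paths: ['group1-shard1of2.bin', 'group1-shard2of2.bin'],
  weights: [
    { name: 'conv/kernel', shape: [3, 3, 3, 16], dtype: 'float32' },
    { name: 'conv/bias',   shape: [16],          dtype: 'float32' },
  ],
}];
console.log(expectedWeightBytes(manifest)); // 3*3*3*16*4 + 16*4 = 1792
```

If this total matches the summed sizes of the .bin shards, the weight files themselves are intact and the problem lies elsewhere (e.g. unsupported ops).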
Hi @linxi-1214, not all TensorFlow operations are supported so some models can be incompatible with TensorFlow.js — See this list for which ops are currently supported.
MobileNet is used in the tutorial because all of its operations are compatible with TensorFlow.js. It looks like you're using a different model that contains an incompatible operation.
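Since "Unknown op" errors come from ops missing in the tfjs runtime, one way to catch this early is to scan the converted model.json for ops outside the supported list before deploying. The sketch below assumes the graph-model format, where `modelTopology.node` is an array of `{name, op}` entries; the `SUPPORTED` set here is a tiny illustrative subset, not the real list (which lives in the tfjs-converter documentation).

```javascript
// List ops in a converted graph model that are absent from a given
// supported-ops set. SUPPORTED is an illustrative subset only.
const SUPPORTED = new Set(['Conv2D', 'Relu', 'Add', 'MatMul', 'Softmax']);

function unsupportedOps(modelJson, supported = SUPPORTED) {
  const ops = new Set(
    (modelJson.modelTopology.node || []).map(n => n.op)
  );
  return [...ops].filter(op => !supported.has(op)).sort();
}

// Made-up topology containing the ops from the error log in this thread:
const modelJson = {
  modelTopology: {
    node: [
      { name: 'conv', op: 'Conv2D' },
      { name: 'tl',   op: 'TensorListFromTensor' },
      { name: 'tl2',  op: 'TensorListStack' },
    ],
  },
};
console.log(unsupportedOps(modelJson));
// ['TensorListFromTensor', 'TensorListStack']
```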
Hi @hugozanini, I have encountered the same issue, but with your own model and data (I followed your tutorial on the TensorFlow blog). It's very strange behaviour, because your model from GitHub works like a charm, but my model crashes during conversion with this log: ValueError: Unsupported Ops in the model before optimization TensorListReserve, TensorListFromTensor, TensorListStack
How come the exact same step-by-step solution results in a different model? I would very much appreciate your help.
Hi @TrybusRafalJan, that's really strange... I guess it could be a different choice you made during the model conversion to the tf.js layers format. Do you mind sharing how you chose the parameters?
Hi @hugozanini, thanks for your answer. I managed to solve this issue; it was caused by a mix of minor errors (version incompatibilities). In the end I made sure I was running the latest tfjs-converter (in a virtual env) and the latest tfjs library in my npm project, and all went well. Best regards
Hi @TrybusRafalJan, I am trying to reproduce the results by checking inference on a server using the provided JSON files in the kangaroo directory, but no predictions or bounding boxes are showing. Can you please share the JSON/model files and the steps to test inference with a local webcam?
The same problem (Unknown op 'TensorListFromTensor') occurs for me when following your tutorial from https://towardsdatascience.com/custom-real-time-object-detection-in-the-browser-using-tensorflow-js-5ca90538eace. Does anyone know why this happens?
It turns out that the TensorFlow.js package from npm needs to be force-updated to the latest version (>3.x); then this error no longer occurs.
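Since the fix reported here is upgrading the runtime past 3.x, a small guard can fail fast when an older version is loaded instead of crashing later with an "Unknown op" error. This is a pure string check; how you obtain the version string (e.g. from the library's exported version field) is left as an assumption.

```javascript
// Return true if a semver-style version string has at least the given
// major version. Used here to guard against pre-3.x TensorFlow.js,
// the version range reported in this thread to cause 'Unknown op' errors.
function isAtLeastMajor(version, major) {
  const parsed = parseInt(version.split('.')[0], 10);
  return Number.isInteger(parsed) && parsed >= major;
}

console.log(isAtLeastMajor('3.21.0', 3)); // true
console.log(isAtLeastMajor('2.8.6', 3));  // false
```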
Can you provide your current code, please?
I cannot get Glitch to do any inferencing with my model whatsoever. I trained it on Google Colab, and it performs detection when I test the saved model, but I followed the steps in his guide and it doesn't work.
Who was able to solve this problem?

```
× Unhandled Rejection (TypeError): Unknown op 'TensorListFromTensor'. File an issue at https://github.com/tensorflow/tfjs/issues so we can add it, or register a custom execution with tf.registerOp()
▶ 22 stack frames were collapsed.
App._this.detectFrame
D:/TFJS-object-detection/src/index.js:66
  63 |
  64 | detectFrame = (video, model) => {
  65 |   tf.engine().startScope();
> 66 |   model.executeAsync(this.process_input(video)).then(predictions => {
     |   ^
  67 |     this.renderPredictions(predictions, video);
  68 |     requestAnimationFrame(() => {
  69 |       this.detectFrame(video, model);
View compiled
(anonymous function)
D:/TFJS-object-detection/src/index.js:56
  53 |
  54 | Promise.all([modelPromise, webCamPromise])
  55 |   .then(values => {
> 56 |     this.detectFrame(this.videoRef.current, values[0]);
     |     ^
  57 |   })
  58 |   .catch(error => {
  59 |     console.error(error);
```
It turns out that the TensorFlow.js package from npm needs to be force-updated to the latest version (>3.x); then this error no longer occurs.
Can you elaborate on how you solved the problem?