androidthings / sample-tensorflow-imageclassifier

Classify camera images locally using TensorFlow models

Custom Trained Model Not Working #20

Open meetmustafa opened 6 years ago

meetmustafa commented 6 years ago

This application works fine with the mobilenet_quant_v1_224.tflite model. I trained a custom model following the TensorFlow for Poets Google codelab and created the graph with this script:

    IMAGE_SIZE=224
    ARCHITECTURE="mobilenet_0.50_${IMAGE_SIZE}"
    python -m scripts.retrain \
      --bottleneck_dir=tf_files/bottlenecks \
      --how_many_training_steps=500 \
      --model_dir=tf_files/models/ \
      --summaries_dir=tf_files/training_summaries/"${ARCHITECTURE}" \
      --output_graph=tf_files/retrained_graph.pb \
      --output_labels=tf_files/retrained_labels.txt \
      --architecture="${ARCHITECTURE}" \
      --image_dir=tf_files/flower_photos

To get a TensorFlow Lite model for this Android Things sample, I followed the tensorflow-for-poets-2-tflite Google codelab and converted the graph with this script:

    toco \
      --input_file=tf_files/retrained_graph.pb \
      --output_file=tf_files/optimized_graph.lite \
      --input_format=TENSORFLOW_GRAPHDEF \
      --output_format=TFLITE \
      --input_shape=1,${IMAGE_SIZE},${IMAGE_SIZE},3 \
      --input_array=input \
      --output_array=final_result \
      --inference_type=FLOAT \
      --input_data_type=FLOAT

After capturing an image on a Raspberry Pi 3 Model B, it gives me this error:

    2018-06-28 12:13:09.115 7685-7735/com.example.androidthings.imageclassifier E/AndroidRuntime: FATAL EXCEPTION: BackgroundThread
    Process: com.example.androidthings.imageclassifier, PID: 7685
    java.lang.IllegalArgumentException: Failed to get input dimensions. 0-th input should have 602112 bytes, but found 150528 bytes.
        at org.tensorflow.lite.NativeInterpreterWrapper.getInputDims(Native Method)
        at org.tensorflow.lite.NativeInterpreterWrapper.run(NativeInterpreterWrapper.java:98)
        at org.tensorflow.lite.Interpreter.runForMultipleInputsOutputs(Interpreter.java:142)
        at org.tensorflow.lite.Interpreter.run(Interpreter.java:120)
        at com.example.androidthings.tensorflow.classifier.TensorFlowImageClassifier.doRecognize(TensorFlowImageClassifier.java:99)
        at com.example.androidthings.tensorflow.ImageClassifierActivity.onImageAvailable(ImageClassifierActivity.java:244)
        at android.media.ImageReader$ListenerHandler.handleMessage(ImageReader.java:812)
        at android.os.Handler.dispatchMessage(Handler.java:106)
        at android.os.Looper.loop(Looper.java:164)
        at android.os.HandlerThread.run(HandlerThread.java:65)
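(For reference, the two byte counts in the exception line up with a float versus a quantized input, assuming the 1x224x224x3 input shape used in the conversion above:

    1 * 224 * 224 * 3 * 4 bytes (float) = 602112   <- what the converted FLOAT model expects
    1 * 224 * 224 * 3 * 1 byte (uint8)  = 150528   <- what the sample's quantized input buffer provides)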

Please help; I am a beginner with TensorFlow.

aashutoshrathi commented 6 years ago

Same here. I'm not able to produce a working custom .tflite file; I can only get the .lite file, and it doesn't work.

iskuhis commented 6 years ago

Yes, I have been struggling with that for a couple of hours, but without result.

aashutoshrathi commented 6 years ago

@iskuhis I fixed it; you can check it out at https://github.com/aashutoshrathi/vision

lc0 commented 5 years ago

@iskuhis the reason is that the original model for Android Things has quantized inputs, so the sample feeds one byte per input value instead of four. You should either switch back to 4 bytes, or adapt how you export your model.
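A minimal sketch of the first option (feeding 4-byte float inputs so a FLOAT .tflite like the one produced by the toco command above can run), assuming a 224x224 RGB input. The class name, method shape, and the [0,1] normalization below are illustrative, not the sample's actual TensorFlowImageClassifier code; the normalization must match whatever mean/std the retrain script used.

    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;
    import java.util.List;
    import org.tensorflow.lite.Interpreter;

    /** Illustrative helper: run a FLOAT .tflite model with 4 bytes per input value. */
    class FloatModelRunner {
        private static final int IMAGE_SIZE = 224;       // must match the size the model was trained with
        private static final int BYTES_PER_CHANNEL = 4;  // float input: 4 bytes, not 1 as in the quantized sample

        /** pixels = ARGB ints of the 224x224 bitmap; labels = lines of retrained_labels.txt */
        static float[] classify(Interpreter tflite, int[] pixels, List<String> labels) {
            ByteBuffer imgData = ByteBuffer.allocateDirect(
                    IMAGE_SIZE * IMAGE_SIZE * 3 * BYTES_PER_CHANNEL);  // 602112 bytes for 224x224x3
            imgData.order(ByteOrder.nativeOrder());
            for (int pixel : pixels) {
                // Write normalized floats instead of the raw bytes the quantized path uses.
                // The /255 scaling is an assumption; adjust to the retrain script's input_mean/input_std.
                imgData.putFloat(((pixel >> 16) & 0xFF) / 255.0f); // R
                imgData.putFloat(((pixel >> 8) & 0xFF) / 255.0f);  // G
                imgData.putFloat((pixel & 0xFF) / 255.0f);         // B
            }
            // A float model also produces float scores (the quantized sample reads byte[1][numLabels]).
            float[][] outputScores = new float[1][labels.size()];
            tflite.run(imgData, outputScores);
            return outputScores[0];
        }
    }

The second option is to re-export the graph with quantized (1-byte) inputs so it matches what the sample already sends.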