google-coral / tflite

Examples using TensorFlow Lite API to run inference on Coral devices
https://coral.withgoogle.com
Apache License 2.0

Error loading and running SSDLite MobileDet from official source #41

Closed leoliak closed 3 years ago

leoliak commented 3 years ago

The detection example works fine with SSD MobileNet v2, but when I load the int8-quantized SSDLite MobileDet model I get the following error:

Traceback (most recent call last):
  File "detect_image.py", line 130, in <module>
    main()
  File "detect_image.py", line 109, in main
    objs = detect.get_output(interpreter, args.threshold, scale)
  File "/home/pi/coral/tflite/python/examples/detection/detect.py", line 153, in get_output
    count = int(output_tensor(interpreter, 3))
TypeError: only size-1 arrays can be converted to Python scalars
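
A minimal debugging sketch (assuming the standard tflite_runtime API; the model path is only illustrative) to print the shape of each output tensor. The TypeError suggests the [1]-shaped count tensor is not at index 3 for this model, i.e. MobileDet's output ordering may differ from what detect.py hard-codes:

import tflite_runtime.interpreter as tflite

# Illustrative path; substitute the model downloaded from coral.ai/models.
MODEL_PATH = 'ssdlite_mobiledet_coco_qat_postprocess_edgetpu.tflite'

interpreter = tflite.Interpreter(
    model_path=MODEL_PATH,
    experimental_delegates=[tflite.load_delegate('libedgetpu.so.1')])
interpreter.allocate_tensors()

# The scalar num-detections output should have shape [1]; if the [1]-shaped
# tensor is not at position 3, the fixed indices used by detect.py will not match.
for i, detail in enumerate(interpreter.get_output_details()):
    print(i, detail['name'], detail['shape'])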

On the other hand, when using the fp32 model, the example code runs, but the detection result is wrong:

----INFERENCE TIME----
Note: The first inference is slow because it includes loading the model into Edge TPU memory.
578.49 ms
477.05 ms
498.25 ms
493.09 ms
551.80 ms
-------RESULTS--------
vase
  id:     85
  score:  0.6933819055557251
  bbox:   BBox(xmin=41, ymin=35, xmax=521, ymax=591)

The input image is the same one used in the SSD MobileNet v2 object detection example.

The SSDLite MobileDet model was downloaded from the following site:

https://coral.ai/models/
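
A minimal workaround sketch, under the assumption that the failure comes from the model's output tensors being ordered differently than the fixed indices in detect.py expect: locate the scalar num-detections output by shape instead of assuming it is output 3 (the function name and approach are illustrative, not part of the example code):

import numpy as np

def detection_count(interpreter):
    """Return the num-detections value by finding the scalar output tensor."""
    for detail in interpreter.get_output_details():
        tensor = np.squeeze(interpreter.get_tensor(detail['index']))
        if tensor.ndim == 0:  # the [1]-shaped count tensor, whatever its position
            return int(tensor)
    raise ValueError('no scalar detection-count output found')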
Tongtong-allure commented 3 years ago

How can this issue be addressed, please?

manoj7410 commented 3 years ago

@leoliak Is this issue still reproducible?

manoj7410 commented 3 years ago

Closing this due to lack of activity. Feel free to reopen if the issue still persists.