robmarkcole / coral-pi-rest-server

Perform inference with TensorFlow Lite models on an RPi, with acceleration from a Coral USB stick
https://coral.ai/products/accelerator
MIT License

Can't get version 2 to run #66

Open · grinco opened 2 years ago

grinco commented 2 years ago

I'm stuck on the legacy API because I'm unable to switch to 2.0 / pycoral. It probably has something to do with me having the M.2 TPU rather than the USB stick:

~/coral-api/coral-pi-rest-server-2.0 # python3 coral-app.py --model  "ssd_mobilenet_v2_coco_quant_postprocess_edgetpu.tflite" --labels "coco_labels.txt" --models_directory "./models/"

 Initialised interpreter with model : ./models/ssd_mobilenet_v2_coco_quant_postprocess_edgetpu.tflite
 * Serving Flask app 'coral-app' (lazy loading)
 * Environment: production
   WARNING: This is a development server. Do not use it in a production deployment.
   Use a production WSGI server instead.
 * Debug mode: on
F port/default/port_from_tf/statusor.cc:38] Attempting to fetch value instead of handling error Failed precondition: Could not map pages : 10 (Device or resource busy)
Exception ignored in: <function Delegate.__del__ at 0x7fc19632f790>
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/tflite_runtime/interpreter.py", line 125, in __del__
TypeError: item 1 in _argtypes_ has no from_param method
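
For context, loading an Edge TPU model with tflite_runtime follows roughly the pattern sketched below (an illustration of the general approach, not the exact code in coral-app.py; the model path is copied from the command above). The Edge TPU delegate takes exclusive ownership of the device, which is why a second process attempting the same load fails with "Device or resource busy":

    import tflite_runtime.interpreter as tflite

    # load_delegate opens the Edge TPU through libedgetpu; the device can
    # only be held by one process at a time, so a second load fails with
    # "Could not map pages : ... (Device or resource busy)".
    delegate = tflite.load_delegate('libedgetpu.so.1')

    interpreter = tflite.Interpreter(
        model_path='./models/ssd_mobilenet_v2_coco_quant_postprocess_edgetpu.tflite',
        experimental_delegates=[delegate],
    )
    interpreter.allocate_tensors()
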
grinco commented 2 years ago

Just a follow-up: I am able to run the test classifier from here https://github.com/google-coral/pycoral.git
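
(For reference, the test referred to is presumably pycoral's bundled classify_image.py example; per the Coral getting-started guide it is run along these lines, after examples/install_requirements.sh classify_image.py has fetched the test data:)

    python3 examples/classify_image.py \
      --model test_data/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite \
      --labels test_data/inat_bird_labels.txt \
      --input test_data/parrot.jpg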

Swiftnesses commented 2 years ago

Same issue here... running Debian in Docker.

robmarkcole commented 2 years ago

hmm not sure

corystevens commented 1 year ago

For anyone else who has this issue: I was able to solve it by adding use_reloader=False to app.run. With debug mode on, Flask's reloader starts the application in a second process, which attempts to load the model (and claim the Edge TPU) a second time.
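
A minimal sketch of the change (the host/port/debug values here are placeholders; coral-app.py's actual arguments may differ):

    if __name__ == '__main__':
        # use_reloader=False stops Werkzeug from spawning the reloader's
        # child process, so the script runs once and the Edge TPU delegate
        # is only opened once, even with debug mode on.
        app.run(host='0.0.0.0', port=5000, debug=True, use_reloader=False)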

jmadden91 commented 1 year ago

Same issue; @corystevens' suggestion worked perfectly.