bitsy-ai / rpi-object-tracking

Object tracking tutorial using TensorFlow / TensorFlow Lite, Raspberry Pi, Pi Camera, and a Pimoroni Pan-Tilt Hat.
https://medium.com/@grepLeigh/real-time-object-tracking-with-tensorflow-raspberry-pi-and-pan-tilt-hat-2aeaef47e134
MIT License

Export detections (bounding-box coordinates, classification, ...) for detection/tracking #62

Open careyer opened 2 years ago

careyer commented 2 years ago

Description

I'm searching for a way to export the bounding-box coordinates from the detection / tracking. I want to evaluate where in the video objects are detected. Is there a way to export that information so it can be processed somewhere else (e.g. by piping the output of rpi-deep-pantilt to another program/process)? It would be awesome if this could work both with and without the Edge TPU. (P.S.: I noticed that the console output with the Edge TPU is much less informative than without it.)
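Something along these lines is what I have in mind: each detection printed as one JSON object per line on stdout, so any downstream process can read it from a pipe. (The field names and record layout here are made up for illustration; I don't know the project's internal detection format.)

```python
import json

def detections_to_jsonl(detections):
    """Serialize detections as newline-delimited JSON, one object per
    line, so another process can consume them from a pipe."""
    return "\n".join(json.dumps(d) for d in detections)

# Hypothetical detection records: label, score, normalized box coords
detections = [
    {"label": "bird", "score": 0.91, "box": [0.12, 0.30, 0.45, 0.62]},
    {"label": "person", "score": 0.55, "box": [0.50, 0.10, 0.90, 0.80]},
]
print(detections_to_jsonl(detections))
```

Then a consumer could just do `rpi-deep-pantilt detect ... | my_consumer`, parsing one JSON object per line.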

Other than that, I'd like to trigger some action if a specific object is detected with a probability above a threshold (>xx%), e.g. if a bird is detected, trigger a deterrent system.
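For the trigger, a small filter on the exported detections would be enough, something like this (again assuming a made-up record layout with `label` and `score` fields):

```python
def should_trigger(detections, target_label="bird", threshold=0.8):
    """Return True if any detection matches the target label with a
    confidence score at or above the threshold."""
    return any(
        d["label"] == target_label and d["score"] >= threshold
        for d in detections
    )

# Example: a bird at 91% confidence would fire the deterrent,
# a person at 55% would not.
detections = [
    {"label": "bird", "score": 0.91},
    {"label": "person", "score": 0.55},
]
```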

Thank you very much for the great project!