Open · manu13008 opened 4 hours ago
Guten Tag, Hans here! 🍻
Thanks for reporting your issue. It sounds like you're facing a few challenges with your YOLO model's latency and low confidence scores. However, I notice you haven't included any logs that would help mrousavy diagnose the problem further.
For troubleshooting, please provide logs from your React Native app. On Android you can gather them with `adb logcat` (for example, `adb logcat *:S ReactNative:V ReactNativeJS:V`); on iOS, use the Xcode console. This information is crucial for pinpointing the issue accurately.
Keep us updated, and we look forward to your logs!
Note: If you think I made a mistake, please ping @mrousavy to take a look.
Hi all,
I exported a generic (trained) YOLOv8n model to TFLite format and loaded it in a React Native app (no Expo).
After working out the output format, I tried to run real-time inference, but I'm facing two issues:
First, I see very long latency even though the model weighs only 6 MB. The inference itself takes about 200 ms, which is slow but probably explained by the 640×640 input size; the strangest part is that the camera preview freezes for much longer than that. For comparison, the EfficientDet model from the example runs in real time with very low latency, so I have no idea what could cause this.
Second (sorry if this isn't entirely related to this repo, but it might be), the confidence scores in my outputs are extremely low (around 0.0000123) and therefore unusable. I suspect the frame isn't being converted to the model's expected input format, which would explain the low scores, since I'm fairly confident about what my camera is recording. Any insight into what I could be doing wrong here?
Here is the code:

My return JSX:
Thanks for the help!