asknoone opened this issue 3 years ago (status: Open)
I recall the Jetson version uses the yolov5s model, presumably for speed. @johnolafenwa can we make this configurable?
@robmarkcole We can do this. @asknoone On the Jetson, DeepStack is optimized primarily for speed rather than maximum accuracy, hence the Jetson version reports lower confidences. This default behaviour is not configurable at the moment, but you can set MODE=High to improve inference accuracy.
An option to make the Jetson version use the same models as the desktop version does not exist today, but we can add it in a future release.
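For anyone who wants to try the high-accuracy mode now, here is a minimal sketch of how to check its effect from a client. It assumes the Jetson container was started with detection enabled and the mode passed as an environment variable (something like `docker run --runtime nvidia -e VISION-DETECTION=True -e MODE=High -p 80:5000 deepquestai/deepstack:jetpack`), and that the standard `/v1/vision/detection` endpoint is exposed on port 80; the host, port, and image path below are placeholders, so adjust them to your setup.

```python
import requests

# Assumed endpoint: adjust host/port to wherever your DeepStack container is exposed.
DEEPSTACK_URL = "http://localhost:80/v1/vision/detection"
IMAGE_PATH = "test.jpg"  # hypothetical test image

with open(IMAGE_PATH, "rb") as f:
    response = requests.post(DEEPSTACK_URL, files={"image": f}, timeout=30)

response.raise_for_status()
for prediction in response.json().get("predictions", []):
    # Each prediction carries a label, a confidence score, and bounding-box coordinates.
    print(f"{prediction['label']}: {prediction['confidence']:.2f}")
```

Running this against the same image with and without MODE=High should make the confidence difference visible directly.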
@johnolafenwa @robmarkcole thanks for the information.
Making this configurable in a future Jetson release would certainly be beneficial. People's requirements differ, but I would happily sacrifice a few hundred milliseconds to get higher confidence levels.
I have just switched to the Jetson version (deepquestai/deepstack:jetpack) and I am finding that objects are either not detected or are detected with low confidence. I was previously using (and have gone back to) the Windows CPU version, downloaded from the Deepstack.cc website a few months ago.
Is this a known issue, or is there a genuine reason for the difference?
I posted an example where you can see the greatly different confidence levels returned for the same image: https://ipcamtalk.com/threads/tool-tutorial-free-ai-person-detection-for-blue-iris.37330/post-513231
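In case it helps anyone reproduce the comparison in a repeatable way, here is a small sketch that sends the same image to both instances and prints the detections side by side. The two base URLs and the image path are placeholders standing in for my Windows CPU install and the Jetson container; swap in your own hosts and ports.

```python
import requests

IMAGE_PATH = "driveway.jpg"  # hypothetical example image

# Placeholder endpoints: replace with the actual Windows CPU and Jetson hosts/ports.
ENDPOINTS = {
    "windows-cpu": "http://192.168.1.10:80/v1/vision/detection",
    "jetson": "http://192.168.1.20:80/v1/vision/detection",
}

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

for name, url in ENDPOINTS.items():
    response = requests.post(url, files={"image": image_bytes}, timeout=30)
    response.raise_for_status()
    predictions = response.json().get("predictions", [])
    print(f"--- {name} ({len(predictions)} objects detected) ---")
    for p in predictions:
        print(f"  {p['label']}: {p['confidence']:.2f}")
```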