Same here on Windows, but when changing to the large size, inference times go up 200ms+.
@quinti123 Check your logs to see if it's trying to load the large model, not finding the .tflite it's looking for, and falling back to CPU. That's what happens when I try to switch to Large on 2.6.5. Maybe 2.8.0 is doing the same thing (and maybe hiding it in the logs if you don't see it).
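If you want to see what that failure mode looks like outside of CPAI, here's a rough sketch using tflite_runtime. This is not the Coral module's actual code, and the model paths are just placeholders; it only illustrates the kind of silent fallback I mean:

```python
# Sketch only -- not the Coral module's actual code; paths are placeholders.
# If the Edge TPU model file or the Edge TPU delegate isn't available,
# it quietly loads a CPU model instead.
import os
from tflite_runtime.interpreter import Interpreter, load_delegate

TPU_MODEL = "assets/efficientdet_lite3_512_ptq_edgetpu.tflite"  # placeholder
CPU_MODEL = "assets/efficientdet_lite3_512_ptq.tflite"          # placeholder

def load_model():
    if os.path.exists(TPU_MODEL):
        try:
            # libedgetpu.so.1 on Linux; use "edgetpu.dll" on Windows
            delegate = load_delegate("libedgetpu.so.1")
            return Interpreter(model_path=TPU_MODEL,
                               experimental_delegates=[delegate]), "TPU"
        except ValueError:
            print("Edge TPU delegate unavailable")
    else:
        print(f"{TPU_MODEL} not found")
    print("falling back to CPU model")
    return Interpreter(model_path=CPU_MODEL), "CPU"

interpreter, device = load_model()
interpreter.allocate_tensors()
print(device, "input shape:", interpreter.get_input_details()[0]["shape"])
```

If the module does something like this without surfacing a warning, you'd see the model "switch" in the UI while the inferences never move off the CPU/default model.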
Actually, I went back to 2.6.5 and it still won't let me switch to EfficientDet. It pretends to be using it but gives exactly the same results as MobileNet. So apparently we've been fake-using other models for a long time now.
Is CPAI dead?
No, see below: Chris has been updating the code for a new release.
Area of Concern
Describe the bug
Changing the model and model size for the Coral no longer works on CPAI 2.8 and Coral module 2.4.0. You're stuck on MobileNet Small, no matter what you pick.
It looks like it's working because the Info tab shows the selected model and model size, but the inferences remain identical to MobileNet Small no matter what you pick. For example, even after choosing YoloV5 Large, the inferences still take ~13ms, which is the typical return time of MobileNet Small.
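A quick way to sanity-check this from outside the dashboard (just a sketch, not an official test; the port and route are the CPAI defaults, and test.jpg is whatever image you have handy) is to time the detection endpoint before and after switching models. If the average doesn't budge, the swap never really happened:

```python
# Rough client-side check: time the detection endpoint before and after
# switching models in the dashboard; identical averages suggest the model
# never actually changed. test.jpg is any sample image.
import time
import requests

URL = "http://localhost:32168/v1/vision/detection"  # default CPAI server port/route
IMAGE = "test.jpg"

def average_round_trip_ms(runs=10):
    with open(IMAGE, "rb") as f:
        data = f.read()
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        requests.post(URL, files={"image": data})
        times.append((time.perf_counter() - start) * 1000)
    return sum(times) / len(times)

print(f"average round trip: {average_round_trip_ms():.1f} ms")
```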
Someone else has also noticed and posted the bug on CPAI Discussions.
Expected behavior
It should change the model and model size properly.
Your System (please complete the following information):