quic / sense

Enhance your application with the ability to see and interact with humans using any RGB camera.
https://20bn.com/products/datasets
MIT License

why are my fps rates so low #175

Closed NB-Xie closed 3 years ago

NB-Xie commented 3 years ago

Thank you for your amazing work!

I was trying to run examples/run_gesture_recognition.py on CPU (i5-8400 @ 2.80GHz) on Windows 10. I thought it would be a real-time recognizer, but the camera / model fps seemed to be very low: around 13 / 1.2 fps for EfficientNet and around 14 / 3.5 fps for MobileNetV2.

I don't know if this is normal; do you have any suggestions? (Screenshots attached.)
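For reference, a minimal sketch (not part of the sense examples, assuming a standard OpenCV installation) that measures how fast the webcam alone can be read, independent of any model, to separate raw camera fps from model fps:

```python
import time
import cv2

# Sketch only: time raw frame grabs from the default camera (index 0).
cap = cv2.VideoCapture(0)  # adjust the index if you have multiple cameras
frames, start = 0, time.time()
while frames < 100:
    ok, _ = cap.read()
    if not ok:
        break
    frames += 1
cap.release()
print(f"~{frames / (time.time() - start):.1f} raw camera fps")
```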

NB-Xie commented 3 years ago

Running run_gesture_detection.py gave me the same issue.

corneliusboehm commented 3 years ago

Hey @NB-Xie, thanks a lot for checking out our project!

The frame rates you observe for MobileNet are actually very close to the target values. In the optimal case it runs at 16 fps camera input and 4 fps model output. This is configured deliberately so that the models can run in real time on different hardware. In our applications, 4 predictions per second were usually frequent enough.
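A rough way to check whether a given CPU can sustain roughly 4 predictions per second is to time forward passes of a comparable backbone. This is a hedged sketch, not the sense pipeline itself; the torchvision MobileNetV2 and the 224x224 input size are stand-in assumptions:

```python
import time
import torch
from torchvision.models import mobilenet_v2

# Stand-in backbone and input size, not the actual sense model.
model = mobilenet_v2().eval()
dummy = torch.randn(1, 3, 224, 224)

with torch.no_grad():
    for _ in range(5):  # warm-up passes
        model(dummy)
    n, start = 20, time.time()
    for _ in range(n):
        model(dummy)
    elapsed = time.time() - start

print(f"~{n / elapsed:.1f} forward passes per second on this CPU")
```

If this lands well above 4 passes per second, the hardware is unlikely to be the bottleneck for the MobileNetV2 configuration.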

EfficientNet is a little more complex, so you might need a GPU to run it at the desired speed.
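As a quick sanity check (sketch only, standard PyTorch calls), you can confirm whether a CUDA GPU is visible before expecting real-time EfficientNet throughput:

```python
import torch

# Sketch only: verify that PyTorch can see a CUDA device.
if torch.cuda.is_available():
    print("CUDA device:", torch.cuda.get_device_name(0))
else:
    print("No CUDA GPU found; EfficientNet will fall back to CPU and may miss 4 fps")
```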

Thanks a lot for raising this issue. We will try to clarify again in the README that 16 / 4 fps is the expected real-time behavior.

NB-Xie commented 3 years ago

Thank you!

MoonBunnyZZZ commented 2 years ago

Hi, do you have the downloaded models? @NB-Xie