I have been monitoring GPU memory usage (using nvidia-smi) during inference while running efficientdet_test_videos.py on my own webcam.
The numbers seem to be way higher than those in the README.
For example, the D0 model takes up 2.5 GB on my GPU even though the README says it should only take 1 GB; the same goes for the other models.
Am I missing something?