zylo117 / Yet-Another-EfficientDet-Pytorch

A PyTorch re-implementation of the official EfficientDet, with SOTA real-time performance and pretrained weights.
GNU Lesser General Public License v3.0
5.21k stars 1.27k forks

GPU memory in inference #505

Open ElGondy opened 4 years ago

ElGondy commented 4 years ago

I have been monitoring GPU memory usage (via nvidia-smi) while running efficientdet_test_videos.py on my own webcam, and the numbers seem to be way higher than those in the README file.

For example, the D0 model takes up 2.5 GB on my GPU even though the README says it should take only about 1 GB; the same goes for the other models.

Am I missing something?

zylo117 commented 4 years ago

Those numbers are not exact because PyTorch allocates memory dynamically. I think the 2 GB+ allocation is the worst case.
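A related point worth noting: nvidia-smi reports everything the process holds on the GPU, including the CUDA context and memory that PyTorch's caching allocator has reserved but is not actively using, so it will generally read higher than the model's true tensor footprint. A minimal sketch of how you could compare PyTorch's own counters (the `mib` helper and the fallback values for CPU-only machines are illustrative, not part of this repo):

```python
import torch

def mib(nbytes: int) -> float:
    """Convert a byte count to mebibytes."""
    return nbytes / (1024 ** 2)

def report_cuda_memory(device: int = 0) -> dict:
    """Return PyTorch's own view of GPU memory on `device`.

    memory_allocated() counts bytes held by live tensors;
    memory_reserved() counts bytes the caching allocator has taken
    from the driver. nvidia-smi roughly reflects the reserved figure
    plus the CUDA context, which PyTorch itself cannot see.
    Falls back to zeros on machines without CUDA (illustrative choice).
    """
    if not torch.cuda.is_available():
        return {"allocated_mib": 0.0, "reserved_mib": 0.0}
    return {
        "allocated_mib": mib(torch.cuda.memory_allocated(device)),
        "reserved_mib": mib(torch.cuda.memory_reserved(device)),
    }

print(report_cuda_memory())
```

Running this right after a forward pass would show the gap between the allocated and reserved figures; `torch.cuda.empty_cache()` releases unused cached blocks back to the driver, which usually brings the nvidia-smi reading closer to the allocated number.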