mlzxy / devit

MIT License
328 stars 45 forks

Inference speed issue #23

Open YanxingLiu opened 10 months ago

YanxingLiu commented 10 months ago

Hello! Thanks for your nice work. I am trying to run the FSOD evaluation demo on the COCO dataset, but the inference speed is quite low on a single 4090 GPU: evaluating 5000 images takes several hours. I have tried decreasing the number of shots, but it doesn't help much. I also find that GPU utilization is quite low (about 30%) while GPU memory is almost full. Is there a way to reduce the memory usage and accelerate inference? Thanks again for your time.

mlzxy commented 10 months ago

Hi @YanxingLiu, it is possible to increase speed by reducing the K from 10 to 3 to get a large speed boost without much accuracy loss (and K=1 would be the fastest setup).
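Reducing K means keeping fewer support (prototype) embeddings per class at inference time. A minimal, hypothetical sketch of what that subsampling looks like; the `prototypes` tensor layout and function name here are assumptions for illustration, not DE-ViT's actual API:

```python
import torch

def subsample_shots(prototypes: torch.Tensor, k: int) -> torch.Tensor:
    """Keep only the first k support embeddings per class.

    prototypes: (num_classes, K, dim) -> (num_classes, k, dim)
    """
    assert 1 <= k <= prototypes.shape[1], "k must be within [1, K]"
    return prototypes[:, :k, :]

# e.g. drop from K=10 shots to k=3 for a speed/accuracy trade-off
full = torch.randn(80, 10, 256)   # 80 COCO classes, 10 shots, 256-dim
fast = subsample_shots(full, 3)   # -> shape (80, 3, 256)
```

Fewer shots per class shrinks the class-prototype matching cost roughly linearly, which is why K=1 is the fastest setup.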

But I don't quite expect the low GPU usage, is it possible the data loading is slow because of other issues?
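One way to confirm a data-loading bottleneck is to time how long each iteration waits on the loader versus how long the forward pass runs. A minimal sketch, assuming a generic PyTorch model and dataloader (the names `model` and `loader` are placeholders, not DE-ViT internals):

```python
import time
import torch

def profile_loop(model, loader, device="cuda", max_batches=50):
    """Split wall time into data-wait vs. compute for a few batches."""
    model.eval()
    data_time, gpu_time = 0.0, 0.0
    end = time.perf_counter()
    with torch.no_grad():
        for i, batch in enumerate(loader):
            data_time += time.perf_counter() - end  # time spent waiting on data
            start = time.perf_counter()
            _ = model(batch)
            if device.startswith("cuda"):
                torch.cuda.synchronize(device)      # wait for GPU kernels to finish
            gpu_time += time.perf_counter() - start
            end = time.perf_counter()
            if i + 1 >= max_batches:
                break
    print(f"data wait: {data_time:.2f}s  compute: {gpu_time:.2f}s")
    return data_time, gpu_time
```

If `data wait` dominates, increasing the dataloader's `num_workers` (and enabling `pin_memory=True`) usually helps more than model-side changes.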

YanxingLiu commented 10 months ago

> Hi @YanxingLiu, it is possible to increase speed by reducing the K from 10 to 3 to get a large speed boost without much accuracy loss (and K=1 would be the fastest setup).
>
> But I don't quite expect the low GPU usage, is it possible the data loading is slow because of other issues?

Thanks for your reply. I will try reducing the number of shots and look into the reason for the low GPU usage. Have a nice day!

retazo0018 commented 2 months ago

Hi @YanxingLiu,

Were you able to figure out the reason and improve the inference speed? Your answer would greatly help.

Many Thanks,