CAIC-AD / YOLOPv2

YOLOPv2: Better, Faster, Stronger for Panoptic Driving Perception

Inference speed slower than results in paper #22

Closed · BokyLiu closed this issue 2 years ago

BokyLiu commented 2 years ago

I ran the same video through YOLOP and YOLOPv2, using the demo.py from each project's repo. The results show the inference speeds are almost the same. Tested on a Tesla T4, YOLOP ran at 36 FPS and YOLOPv2 at 40 FPS with the traced model (is it faster only because of tracing?). In your preprint, v2 is reported as almost twice as fast as YOLOP.
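
For reference, the FPS reported by demo.py includes video decoding, resizing, and post-processing, which can mask the difference between the two networks. A minimal model-only timing sketch (assuming a PyTorch model object and a hypothetical 1x3x384x640 input; this is not the repo's own benchmark script) might look like:

```python
import time
import torch

@torch.no_grad()
def measure_fps(model, shape=(1, 3, 384, 640), iters=100, warmup=10):
    """Model-only FPS, excluding decoding and post-processing (illustrative sketch)."""
    device = "cuda"
    model = model.to(device).eval()
    dummy = torch.rand(*shape, device=device)   # assumed input resolution
    for _ in range(warmup):                     # warm-up: allocator caching, cuDNN autotune
        model(dummy)
    torch.cuda.synchronize()
    start = time.time()
    for _ in range(iters):
        model(dummy)
    torch.cuda.synchronize()                    # wait for all GPU work before stopping the clock
    return iters / (time.time() - start)
```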

CAIC-AD commented 2 years ago

In our demo test, the inference times of YOLOP and YOLOPv2 are 0.020~0.023 s and 0.011~0.014 s respectively on a Tesla V100. We suggest you try a Tesla V100 environment to reproduce the numbers in the paper (memory allocation is very important for inference speed; you can refer to the comparison of inference times between YOLOv5s and YOLOv7 in the YOLOv7 paper). In fact, tracing wastes some time, but that overhead will not appear in the official version.
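
To illustrate the tracing-overhead point, eager and traced timings can be compared as sketched below (generic names, not the repo's own scripts; the model object and input shape are assumptions):

```python
import torch

@torch.no_grad()
def compare_eager_vs_traced(model, shape=(1, 3, 384, 640), iters=100):
    """Illustrative: milliseconds per forward pass in eager mode vs. after torch.jit.trace."""
    device = "cuda"
    model = model.to(device).eval()
    dummy = torch.rand(*shape, device=device)
    traced = torch.jit.trace(model, dummy)      # tracing itself is a one-off cost

    def bench(m):
        for _ in range(10):                     # warm-up so the allocator and cuDNN settle
            m(dummy)
        start = torch.cuda.Event(enable_timing=True)
        end = torch.cuda.Event(enable_timing=True)
        torch.cuda.synchronize()
        start.record()
        for _ in range(iters):
            m(dummy)
        end.record()
        torch.cuda.synchronize()
        return start.elapsed_time(end) / iters  # ms per forward pass

    return bench(model), bench(traced)
```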

By the way, you can comment out line 456 in utils.py; that helps free memory. Thank you for your attention.
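
The contents of that line are not shown in this thread, so the following is only an assumption about the kind of memory issue meant; the general pattern for reclaiming cached GPU memory between inference runs looks like:

```python
import gc
import torch

def free_cached_gpu_memory():
    """General pattern for reclaiming GPU memory between inference runs.
    Whether this relates to what utils.py line 456 does is an assumption."""
    gc.collect()              # drop unreferenced Python objects first
    torch.cuda.empty_cache()  # return PyTorch's cached, unused blocks to the driver
```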