CAIC-AD / YOLOPv2

YOLOPv2: Better, Faster, Stronger for Panoptic driving Perception

Inference times too high #32

Closed lucalazzaroni closed 1 year ago

lucalazzaroni commented 1 year ago

Hi everyone,

I tried the demo with the pre-trained model on an NVIDIA A4000 GPU, giving a 560x349 PNG image as input. The results are impressive from a quality perspective, but what concerns me most are the inference times. Indeed, I get these values:

inf : (1.1654s/frame)   nms : (0.0007s/frame)
Done. (1.197s)

I also noticed that the inf value remains pretty stable whether I reduce or increase the input size... I'd like to use this network in an online setting, but with these inference times that would be very difficult. Am I doing something wrong?
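
For reference, this is roughly how I would expect an accurate GPU timing to be taken in PyTorch: warm-up iterations first (the initial CUDA calls pay for context creation and cuDNN autotuning), then `torch.cuda.synchronize()` around the timed loop, since kernel launches are asynchronous. This is only a sketch; the weights path and input shape below are placeholders, not the demo's actual arguments.

```python
import time
import torch

# Placeholders: 'yolopv2.pt' and the 384x640 input shape are assumptions,
# not the demo's actual arguments.
device = torch.device('cuda')
model = torch.jit.load('yolopv2.pt', map_location=device)
model.eval()

dummy = torch.randn(1, 3, 384, 640, device=device)

with torch.no_grad():
    # Warm-up: the first CUDA calls pay for context creation and
    # cuDNN autotuning, which heavily inflates the first measurement.
    for _ in range(10):
        model(dummy)

    torch.cuda.synchronize()  # flush queued kernels before starting the clock
    start = time.time()
    for _ in range(100):
        model(dummy)
    torch.cuda.synchronize()  # kernels launch asynchronously; sync before stopping
    print(f"inf : ({(time.time() - start) / 100:.4f}s/frame)")
```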

Thank you in advance for your help!

lucalazzaroni commented 1 year ago

A little update. I noticed that if I use the CPU by passing the --device cpu argument, the results improve considerably. Here is an example:

inf : (0.3076s/frame)   nms : (0.0007s/frame)
Done. (0.394s)

So, is the demo not exploiting the GPU correctly?
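
In case it helps, these are the checks I would run to confirm the model and the input actually live on the GPU (a sketch; the weights path and input shape are placeholders):

```python
import torch

print(torch.cuda.is_available())        # should be True
print(torch.cuda.get_device_name(0))    # should report the A4000

# Placeholder path, not the demo's actual argument.
model = torch.jit.load('yolopv2.pt', map_location='cuda')
print(next(model.parameters()).device)  # expect cuda:0

img = torch.randn(1, 3, 384, 640).to('cuda')
print(img.device)                       # expect cuda:0; a CPU input against a
                                        # CUDA model raises a device-mismatch error
```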

abhigoku10 commented 1 year ago

@lucalazzaroni Just a thought: can you check how much GPU and CPU memory is being occupied? That would give us a better understanding of the issue you are facing.
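
For example, something along these lines would show both sides (a sketch using torch plus the third-party psutil package, not code from the repo):

```python
import psutil
import torch

# CPU side: resident set size of the current process.
proc = psutil.Process()
print(f"CPU RSS     : {proc.memory_info().rss / 2**20:.1f} MiB")

# GPU side: memory PyTorch has allocated and reserved on the device;
# `nvidia-smi` shows the same per-process usage from outside Python.
print(f"GPU allocated: {torch.cuda.memory_allocated() / 2**20:.1f} MiB")
print(f"GPU reserved : {torch.cuda.memory_reserved() / 2**20:.1f} MiB")
```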

sgzqc commented 1 year ago

Please check these three blogs: 1) YOLOPv2 2) YOLOP 3) HybridNets

lucalazzaroni commented 1 year ago

Sorry for the late update. The issue still persists, similarly to https://github.com/CAIC-AD/YOLOPv2/issues/39. Could it be related to the PyTorch version? I followed the same steps explained in the referenced issue.
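
For completeness, a quick environment check along these lines (a sketch) should show whether the installed wheel is CPU-only or built against a mismatched CUDA version, either of which could explain GPU timings that make no sense:

```python
import torch

print(torch.__version__)               # a '+cpu' suffix means no CUDA support
print(torch.version.cuda)              # CUDA version the wheel was built with; None on CPU builds
print(torch.cuda.is_available())       # must be True for the GPU to be used at all
print(torch.backends.cudnn.version())  # None if cuDNN is missing
```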