Open rafikg opened 5 years ago
I'm having the same issue on a 2080 Ti. The GPU is only 20-28% utilized and only about half the memory is used (5487 MB). My inference time is 0.18 to 0.2 s per image, i.e. 5 to 6 FPS. I think it's using a batch size of 1 and I'm not sure how to increase it. I tried increasing IMS_PER_BATCH from 3 to 16 or 32 in the config yaml file, but that doesn't change anything.
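For what it's worth, here is the minimal timing sketch I use to check the per-image latency independently of the config. `run_densepose_inference` is just a hypothetical placeholder for whatever call wraps the model forward pass in your script (e.g. the body of infer_simple); it is not a real DensePose API name.

```python
import time

import cv2  # OpenCV is already required by the DensePose demo scripts


def measure_fps(image_paths, run_densepose_inference, warmup=5):
    """Time a per-image inference call and report the average FPS.

    `run_densepose_inference` is a placeholder for your own wrapper around
    the model forward pass; swap in whatever your script actually calls.
    """
    images = [cv2.imread(p) for p in image_paths]

    # Warm-up iterations so GPU/context initialisation does not skew timing.
    for im in images[:warmup]:
        run_densepose_inference(im)

    start = time.perf_counter()
    for im in images:
        run_densepose_inference(im)
    elapsed = time.perf_counter() - start

    per_image = elapsed / len(images)
    print(f"{per_image:.3f} s/image -> {1.0 / per_image:.1f} FPS")
    return 1.0 / per_image
```

With 0.18-0.2 s per image this reports the same 5-6 FPS as above regardless of the IMS_PER_BATCH value, so that setting does not seem to reach the inference path at all.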
@therobotprogrammer, I think you should forget about this project. It is not real time at all. I and more than two other people tried it, and we all got a very low frame rate...
@nvrv @ralpguler I installed the DensePose project correctly and all the tests pass. I wrote an infer_simple_webcam script to run the code on a webcam feed (a stripped-down sketch of that loop is below). With 240x320 images (as indicated in the original paper) I get a very low frame rate of ~5-6 fps, whereas the original paper reports 20-26 fps.
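Roughly, the infer_simple_webcam loop looks like this (simplified; `infer_on_frame` stands in for the actual DensePose inference and visualization calls lifted from infer_simple.py, it is not a real DensePose function name):

```python
import time

import cv2


def webcam_demo(infer_on_frame, width=320, height=240):
    """Grab frames from the default webcam, resize to 240x320 as in the paper,
    run a (placeholder) inference call and print the effective frame rate."""
    cap = cv2.VideoCapture(0)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            frame = cv2.resize(frame, (width, height))

            start = time.perf_counter()
            output = infer_on_frame(frame)  # ~0.18-0.2 s per frame on my 2080 Ti
            elapsed = time.perf_counter() - start
            print(f"inference: {elapsed:.3f} s ({1.0 / elapsed:.1f} FPS)")

            cv2.imshow("densepose", output)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()
```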
Note: I had these warnings when running the code: