After training the model, I ran some inference tests and found that it takes about 1.5 seconds to infer a 640x360 image on a 1080Ti, while CMU OpenPose seems to take about 100 ms per image.
So I wonder whether this is because of Keras, Python, or something else?
Could you give me some advice? @kevinlin311tw
Yes, I think it is because of Keras. A possible solution is to convert the model file to a TensorFlow .pb file; inference should be faster that way.
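Here is a minimal sketch of one common way to do that conversion, assuming TensorFlow 1.x with standalone Keras; the model path and output file name are placeholders, not files from this repo:

```python
# Sketch: freeze a trained Keras model into a TensorFlow .pb graph (TF 1.x style).
import tensorflow as tf
from tensorflow.python.framework import graph_util
from keras import backend as K
from keras.models import load_model

K.set_learning_phase(0)                 # inference mode: disable dropout / BN updates
model = load_model('trained_model.h5')  # placeholder path to the trained Keras model

sess = K.get_session()
output_names = [out.op.name for out in model.outputs]

# Replace variables with constants so the graph can be serialized standalone
frozen_graph = graph_util.convert_variables_to_constants(
    sess, sess.graph.as_graph_def(), output_names)

with tf.gfile.GFile('frozen_model.pb', 'wb') as f:
    f.write(frozen_graph.SerializeToString())
```

The frozen graph can then be loaded with `tf.import_graph_def` and run in a plain `tf.Session`, which avoids the per-call overhead of the Keras prediction loop.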
Thank you for your wonderful work.