Open mingmingDiii opened 7 years ago
You are right! I have met the same situation. Can someone help me figure it out?
@mingmingDiii @allenwangcheng I ran several iterations of model.predict(input_img), using only scale 1 and an input of shape 712x673x3. The first iteration took around 2300ms, but all subsequent iterations took around 260ms (GPU 1070). I am not sure, but there may be an additional cost of recompiling the computation graph. I also noticed a latency peak every time the input image size changes (scale = multiplier[m]).
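Not exactly how this test was run, but here is a minimal benchmarking sketch that shows both effects (the one-off warm-up on the first call, and a latency peak whenever the input shape changes). The model path 'model.h5', the 368-pixel base size, and the assumption that the network accepts variable spatial dimensions are placeholders, not taken from this repo:

```python
import time
import numpy as np
from keras.models import load_model

# Placeholder path: load whatever model/weights you are benchmarking.
model = load_model('model.h5')

def time_predict(model, shape, runs=5):
    # Time repeated model.predict calls on a random input of the given shape (ms).
    x = np.random.rand(1, *shape).astype(np.float32)
    timings = []
    for _ in range(runs):
        t0 = time.time()
        model.predict(x)
        timings.append((time.time() - t0) * 1000.0)
    return timings

# Same input size on every call: only the first call pays the warm-up cost
# (graph construction / CUDA and cuDNN initialization); later calls are fast.
print(time_predict(model, (673, 712, 3)))

# Changing the input size, as the multi-scale loop does with scale = multiplier[m],
# pays the slow path again for each new shape. This assumes the model is fully
# convolutional, i.e. its input shape is (None, None, 3).
for scale in (0.5, 1.0, 1.5):
    size = int(368 * scale)
    print(scale, time_predict(model, (size, size, 3), runs=2))
```

If the timings for repeated calls at a fixed size settle around a low value, the slow first measurement is warm-up overhead rather than per-image inference cost.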
@michalfaber Thanks for your great work!! When I test the image 'ski.jpg' (shape: 712x674x3), 'model.predict(input_img)' took about 1200ms on a TITAN X GPU (only with scale 1), but in the Caffe version, 'output_blobs = net.forward()' only took about 72ms. Can you help me figure it out? Thanks a lot!!