ibm-aur-nlp / PubLayNet


Batch Inference Time Calculation #28

Open jmandivarapu1 opened 4 years ago

jmandivarapu1 commented 4 years ago

Hi Team, I have a question about PubLayNet's runtime at inference (test) time. Right now the arguments take one image at a time. I tried passing multiple images and processing them one at a time, and that works.

But I am curious how to process a batch of images at once, e.g. a batch of 64 or 128.

Can anybody estimate the time for 100 images if they are processed as a batch?

Currently it takes about 14 minutes to run PubLayNet section detection on 1000 images when processing them one by one, in evaluation mode, on a Tesla K80 GPU (or CPU).
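For a rough back-of-the-envelope check based only on the numbers above (my own arithmetic, assuming runtime scales linearly with image count), that sequential rate works out to under a second per image:

```python
# Back-of-the-envelope throughput from the timing quoted above:
# 14 minutes for 1000 images, assumed to scale linearly.
total_seconds = 14 * 60              # 840 s for 1000 images
per_image = total_seconds / 1000     # seconds per image

print(f"{per_image:.2f} s/image")                 # 0.84 s/image
print(f"{per_image * 100:.0f} s for 100 images")  # 84 s for 100 images, sequentially
```

So batching would need to beat roughly 84 s for 100 images to be worthwhile on this hardware.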

Or does it follow Detectron2, which can process about 8 images per second during training? I am not sure what that means for test time.
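If the checkpoint is run through Detectron2, note that `DefaultPredictor` is single-image by design, but the underlying model in eval mode accepts a list of input dicts, so batching mostly amounts to chunking the image list. A minimal sketch of the chunking side (`predict_batch` is a hypothetical placeholder, not a PubLayNet or Detectron2 API):

```python
def chunked(items, batch_size):
    """Split a list into consecutive batches of at most batch_size items."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

def predict_batch(batch):
    # Hypothetical placeholder. With Detectron2 this would look roughly like:
    #   with torch.no_grad():
    #       outputs = model([{"image": img} for img in batch])
    # where `model` is the raw model in eval mode, not DefaultPredictor.
    return [f"detections for {path}" for path in batch]

image_paths = [f"page_{i}.png" for i in range(10)]
results = []
for batch in chunked(image_paths, batch_size=4):
    results.extend(predict_batch(batch))

print(len(results))  # 10
```

Whether this actually speeds things up on a K80 depends on GPU memory and image size; the batch size may need to be much smaller than 64 or 128.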