Open rm-asif-amin opened 5 months ago
Sorry, I don't know how to make the inference much faster.
Okay. It seems like the classification model is taking a while. Here's a breakdown -
```
Image loading time: 0.02506279945373535
Text detection time: 0.5832889080047607
Processing boxes time: 0.017578125
100%|██████████| 5/5 [01:22<00:00, 16.57s/it]
Language classification time: 82.87071657180786
English processing time: 1.7918636798858643
Bangla processing time: 9.191173553466797
```
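For anyone who wants to reproduce a breakdown like this in their own pipeline, the numbers can be collected with plain wall-clock timers around each stage. A minimal sketch (the stage functions here are hypothetical stand-ins, not the project's real code):

```python
import time

def timed(label, fn, *args, **kwargs):
    """Run fn, print its wall-clock duration, and return its result."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    print(f"{label} time: {time.perf_counter() - start}")
    return result

# Hypothetical stand-ins for the real pipeline stages:
def load_image():
    return "image"

def detect_text(img):
    return ["box1", "box2"]

img = timed("Image loading", load_image)
boxes = timed("Text detection", detect_text, img)
```

Timing each stage separately like this makes it obvious which one dominates (here, language classification at ~83 s of the total).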
Hi, thanks for sharing this. The architecture seems to have good accuracy, but I'm getting very slow inference speed on GPU (I changed the EasyOCR and Paddle settings to GPU).
For example, extracting the text from an NID takes 53 seconds. Isn't ONNX supposed to be much faster (milliseconds)?