linghu8812 / tensorrt_inference


YoloV4-tiny works with batch size 64 but produces wrong boxes with batch size 1. #113

Closed: Ram-Godavarthi closed this issue 2 years ago

Ram-Godavarthi commented 2 years ago

Hello, does anyone have any idea about running the YoloV4-tiny model with batch size 1?

I used the Yolov4 repo to generate the ONNX file. By default, my cfg had batch size 64. Building the engine took a while, and inference worked as expected, but it was very slow.

Then I realized I should set batch size 1 in my cfg file. The ONNX was generated and the engine was built.
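For context, the batch dimension that ends up baked into the exported ONNX typically comes from the `[net]` section of the darknet .cfg. The excerpt below is illustrative only, not copied from my file; values other than `batch`/`subdivisions` are common defaults for a 416x416 model:

```ini
# [net] section of the darknet .cfg (illustrative; only batch/subdivisions changed)
[net]
batch=1          # was 64; this is the batch dimension carried into the exported ONNX
subdivisions=1   # keep <= batch
width=416
height=416
channels=3
```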

But when I run inference on some test data, I get the correct classes and probabilities, but the boxes are wrong. They are far too small and placed at the top of the image every time.

What could be the reason for this? Should I change something (in the output calculation) when I change the batch size to 1?
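For reference, one plausible cause of this symptom is a mismatch between the batch size the engine was built with and the batch size assumed when the flat output buffer is decoded. The sketch below is not the repo's actual post-processing; the output layout, box count, and class count are assumptions for a 416x416 YoloV4-tiny with COCO classes, just to show how decoding with the wrong batch assumption scrambles the box fields:

```python
# Minimal sketch (assumed, not the repo's code) of decoding a flat TensorRT
# output for YoloV4-tiny, assuming a [batch, num_boxes, 5 + classes] layout,
# 416x416 input, and 80 COCO classes. All names/constants are hypothetical.
import numpy as np

NUM_CLASSES = 80                      # assumption: COCO
NUM_BOXES = (26 * 26 + 13 * 13) * 3   # two YOLO heads, 3 anchors each -> 2535
BOX_STRIDE = 5 + NUM_CLASSES          # x, y, w, h, objectness + class scores

def decode_output(flat_output: np.ndarray, batch_size: int) -> np.ndarray:
    """Reshape the flat engine output into (batch, num_boxes, 5 + classes).

    If batch_size here does not match the batch size the engine was built
    with, the reshape misaligns the box fields, which shows up as tiny,
    shifted, or negative boxes.
    """
    expected = batch_size * NUM_BOXES * BOX_STRIDE
    assert flat_output.size == expected, (
        f"got {flat_output.size} values, expected {expected}: "
        "build-time and decode-time batch sizes must agree"
    )
    return flat_output.reshape(batch_size, NUM_BOXES, BOX_STRIDE)

# Usage: a buffer sized for batch 1 decodes cleanly as batch 1; decoding it as
# if the engine still had batch 64 would fail the size check above.
dummy = np.zeros(NUM_BOXES * BOX_STRIDE, dtype=np.float32)
print(decode_output(dummy, batch_size=1).shape)   # (1, 2535, 85)
```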

Any help or hint would be appreciated.

Thank you

Ram-Godavarthi commented 2 years ago

Solved it.

GitForKriti commented 2 years ago

@Ram-Godavarthi How did you solve it? My boxes are negative.