Open aidevmin opened 10 months ago
It creates the engine with dynamic batch-size. But you need to set the batch-size
in the config infer file to the OPT/MAX batch-size value to create the correct engine.
@marcoslucianops Thanks. Do you mean changing this line? https://github.com/marcoslucianops/DeepStream-Yolo/blob/9bda315ee0834ca0fb2d7f6b5f34c0a69ddc24e0/config_infer_primary.txt#L10 I will try it and let you know.
Yes, it needs to be the OPT/MAX batch-size.
As of now, the code that converts Darknet YOLOv4 weights to an engine does not support dynamic batch size. I tried measuring inference time with a dynamic batch size using trtexec, but it still sets the input size to 1x3x508x608, which is not correct. Please add this part for flexible usage. Thanks.
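For reference, when an exported model has a dynamic batch dimension, trtexec needs explicit shape ranges to build the engine; otherwise it may fall back to a fixed batch of 1. A sketch of the invocation (the file names, the input tensor name `input`, and the 608x608 resolution are assumptions that depend on how the model was exported):

```shell
# Build a dynamic-batch TensorRT engine from an ONNX export.
# min = smallest batch, opt = batch the engine is tuned for, max = largest batch.
trtexec \
  --onnx=yolov4.onnx \
  --minShapes=input:1x3x608x608 \
  --optShapes=input:4x3x608x608 \
  --maxShapes=input:8x3x608x608 \
  --saveEngine=yolov4_dynamic.engine
```

To then benchmark a specific batch size with the same tool, add `--shapes=input:4x3x608x608` when running the saved engine. In DeepStream, the opt/max batch size chosen here is what the `batch-size` property in the infer config should match.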