I have created the ONNX model file with an implicit batch size:
python3 export_yoloV8.py -w yolov8s.pt --simplify
This uses the default batch-size=1. The ONNX file is then used to create the engine file, and the DeepStream app runs with 1 stream. Whenever I add more streams, I have to create a new ONNX file with a specific batch size (batch-size = number of streams).
ONNX model generation command for two streams:
python3 export_yoloV8.py -w yolov8s.pt --simplify --batch 2
Dynamic batch size is not supported with DeepStream 6.0. Please add support for this.