marcoslucianops / DeepStream-Yolo-Pose

NVIDIA DeepStream SDK 6.3 / 6.2 / 6.1.1 / 6.1 / 6.0.1 / 6.0 application for YOLO-Pose models
MIT License

Backend has maxBatchSize 1 whereas 2 has been requested, model_b2_gpu0_fp32.engine failed to match config params #15

Closed: willosonico closed this issue 7 months ago

willosonico commented 7 months ago

The run fails with the messages below. Thank you!

0:00:07.082318833 29 0xffff7cd19490 INFO nvinfer gstnvinfer.cpp:646:gst_nvinfer_logger: NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() [UID = 1]: deserialized trt engine from :/model_b2_gpu0_fp32.engine
WARNING: [TRT]: The getMaxBatchSize() function should not be used with an engine built from a network created with NetworkDefinitionCreationFlag::kEXPLICIT_BATCH flag. This function will always return 1.
INFO: [Implicit Engine Info]: layers num: 4
0 INPUT kFLOAT input 3x640x640
1 OUTPUT kFLOAT boxes 25200x4
2 OUTPUT kFLOAT scores 25200x1
3 OUTPUT kFLOAT classes 25200x1

0:00:07.232329468 29 0xffff7cd19490 WARN nvinfer gstnvinfer.cpp:643:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::checkBackendParams() [UID = 1]: Backend has maxBatchSize 1 whereas 2 has been requested
0:00:07.234135304 29 0xffff7cd19490 WARN nvinfer gstnvinfer.cpp:643:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() [UID = 1]: deserialized backend context :/model_b2_gpu0_fp32.engine failed to match config params, trying rebuild
0:00:07.287486636 29 0xffff7cd19490 INFO nvinfer gstnvinfer.cpp:646:gst_nvinfer_logger: NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() [UID = 1]: Trying to create engine from model files
WARNING: [TRT]: onnx2trt_utils.cpp:367: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
WARNING: [TRT]: onnx2trt_utils.cpp:395: One or more weights outside the range of INT32 was clamped
WARNING: [TRT]: Tensor DataType is determined at build time for tensors not marked as input or output.

Building the TensorRT Engine
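For reference, the checkBackendParams warning means the cached model_b2_gpu0_fp32.engine reports max batch size 1 while the gst-nvinfer config requests batch-size 2, so DeepStream discards the cached engine and rebuilds it from the ONNX file. Below is a minimal sketch of the relevant [property] keys in the nvinfer config; the file and model names are placeholders, not the exact ones from this setup:

```
# Hypothetical excerpt from the gst-nvinfer config (names are placeholders)
[property]
gpu-id=0
# ONNX model that TensorRT builds the engine from
onnx-file=model.onnx
# Engine file name encodes batch (b2) and precision (fp32); it must match the settings below
model-engine-file=model_b2_gpu0_fp32.engine
# Requested max batch size; the warning appears when the deserialized engine
# was built for a smaller batch (here: 1)
batch-size=2
# 0=FP32, 1=INT8, 2=FP16
network-mode=0
```

As the log shows, DeepStream automatically falls back to building a new engine when the mismatch is detected, so these warnings are expected on the first run after changing batch-size; once the rebuilt engine is serialized, later runs deserialize it directly.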

willosonico commented 7 months ago

Wrong repository for this issue; moved to https://github.com/marcoslucianops/DeepStream-Yolo/issues/515.