galliot-us / PifPaf-TensorRT-Pose-Estimation


Cannot export onnx to trt #5

Closed. HoangTienDuc closed this issue 1 year ago.

HoangTienDuc commented 3 years ago

Hi, thanks for your awesome work. I tried to convert the ONNX model to TensorRT, but TensorRT cannot find the network output:

Parsing model
Building TensorRT engine, FP16 available:1
    Max batch size:     1
    Max workspace size: 1024 MiB
[2021-09-26 04:00:17   ERROR] Network must have at least one output
[2021-09-26 04:00:17   ERROR] Network validation failed.
terminate called after throwing an instance of 'std::runtime_error'
  what():  Failed to create object
Aborted (core dumped)
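
The "Network must have at least one output" error usually means the ONNX parser either failed partway through the graph or did not register any output tensors, so network validation rejects the empty output list. One common workaround is to check the parser's error list and, if no outputs were registered, mark the last layer's output explicitly before building the engine. Below is a minimal sketch using the TensorRT Python API; the file name openpifpaf.onnx is a placeholder, not the actual model path used in this repository, and the exact build calls may need adjusting for your TensorRT version.

import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
EXPLICIT_BATCH = 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)

def build_engine(onnx_path="openpifpaf.onnx"):  # placeholder path
    builder = trt.Builder(TRT_LOGGER)
    network = builder.create_network(EXPLICIT_BATCH)
    parser = trt.OnnxParser(network, TRT_LOGGER)

    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            # Print parser errors instead of failing silently; a partially
            # parsed graph is a frequent cause of the "at least one output" error.
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            return None

    # If the parser did not register any outputs, mark the last layer's
    # output manually so network validation can succeed.
    if network.num_outputs == 0:
        last_layer = network.get_layer(network.num_layers - 1)
        network.mark_output(last_layer.get_output(0))

    config = builder.create_builder_config()
    config.max_workspace_size = 1 << 30  # 1 GiB
    if builder.platform_has_fast_fp16:
        config.set_flag(trt.BuilderFlag.FP16)
    return builder.build_engine(network, config)

If the parser reports errors, the real fix is usually on the export side (e.g. re-exporting the ONNX with explicit output names and a supported opset) rather than marking outputs after the fact.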