linghu8812 / tensorrt_inference


Network must have at least one output #163

Closed bewithme closed 2 years ago

bewithme commented 2 years ago

```
/app/tensorrt_inference/bin# ./tensorrt_inference mmpose ../configs/mmpose/config.yaml ../samples/detection_segmentation
Could not open file ../weights/hrnet_w48_coco_256x192.onnx
Could not open file ../weights/hrnet_w48_coco_256x192.onnx
Failed to parse ONNX model from file../weights/hrnet_w48_coco_256x192.onnx
start building engine
[11/30/2022-05:56:30] [E] [TRT] Network must have at least one output
[11/30/2022-05:56:30] [E] [TRT] Network validation failed.
build engine done
tensorrt_inference: /app/tensorrt_inference/code/src/model.cpp:46: void Model::OnnxToTRTModel(): Assertion `engine' failed.
Aborted (core dumped)
```