chenjun2hao / CenterFace.pytorch

Unofficial version of CenterFace, which achieves the best balance between speed and accuracy for face detection.

Error in TensorRT inference #15

Open PankajJ08 opened 4 years ago

PankajJ08 commented 4 years ago

There is an error in the post-processing of the TensorRT output. I converted the model to ONNX and then to a TRT engine. Running on a Jetson Nano with CUDA 10, TensorRT 6, and torch 1.2.

  File "demo_tensorrt.py", line 107, in
    detections = body_engine.run(rgb_img)[1]
  File "ML/CenterFace.pytorch/TensorRT/centernet_tensorrt_engine.py", line 58, in run
    predictions = self.postprocess(trt_output, meta)
  File "ML/CenterFace.pytorch/TensorRT/centernet_tensorrt_engine.py", line 201, in postprocess
    hm, wh, hps, reg, hm_hp, hp_offset = args[0]; meta = args[1]
ValueError: not enough values to unpack (expected 6, got 4)
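For context, the error says the TRT engine returns 4 output tensors while `postprocess` unpacks the 6 heads of the CenterNet multi-pose layout. Below is a minimal, hypothetical sketch of a helper that tolerates both layouts; the 4-head names and their order (heatmap, box size, center offset, 5-point landmarks) are assumptions about the CenterFace ONNX export and are not confirmed by the repo.

```python
# Hypothetical helper for centernet_tensorrt_engine.py; head names and order
# for the 4-output case are assumed, not confirmed -- verify them against
# the exported ONNX graph before relying on this.

def unpack_heads(outputs):
    """Map the raw TensorRT output tensors to named heads.

    Handles both the 4-head CenterFace export and the 6-head
    CenterNet multi-pose layout that postprocess() currently expects.
    """
    if len(outputs) == 4:
        # Assumed order: heatmap, box size, center offset, landmarks.
        hm, wh, reg, lm = outputs
        return {"hm": hm, "wh": wh, "reg": reg, "hps": lm,
                "hm_hp": None, "hp_offset": None}
    if len(outputs) == 6:
        # Original layout expected by the repo code.
        hm, wh, hps, reg, hm_hp, hp_offset = outputs
        return {"hm": hm, "wh": wh, "hps": hps, "reg": reg,
                "hm_hp": hm_hp, "hp_offset": hp_offset}
    raise ValueError(f"unexpected number of output heads: {len(outputs)}")
```

Inside `postprocess`, something like `heads = unpack_heads(args[0])` could replace the six-way unpacking, with any branch that touches `hm_hp` or `hp_offset` guarded by an `is not None` check.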

harshall28 commented 4 years ago

Hi @chenjun2hao

Could you please look into this issue? I am getting the same error.

Thanks, Harshall.

Harryqu123 commented 4 years ago

I am also getting the same issue with CUDA 10.2 and TensorRT 7. Has anyone solved this?

GilbertTam commented 4 years ago

I am facing the same issue. Any updates?