Cross Stage Partial Networks
https://github.com/WongKinYiu/CrossStagePartialNetworks

Can csmobilenetv2 use TensorRT to accelerate inference? #29

Open Damon0626 opened 4 years ago

Damon0626 commented 4 years ago

I have trained my own model with csmobilenetv2 (backbone) and yolov3_tiny (head) and got the final best weights. Now I want to accelerate inference with TensorRT, version 5.1.6.1, but I get errors when converting the weights model to an ONNX model. Using yolov3.weights/yolov3.cfg and yolov3-tiny.weights/yolov3-tiny.cfg works fine.

tensorrt: 5.1.6.1
python: 3.6.9
onnx: 1.4.1
numpy: 1.18.4

```
Traceback (most recent call last):
  File "yolov3_to_onnx.py", line 840, in <module>
    main()
  File "yolov3_to_onnx.py", line 827, in main
    verbose=True)
  File "yolov3_to_onnx.py", line 447, in build_onnx_graph
    params)
  File "yolov3_to_onnx.py", line 322, in load_conv_weights
    conv_params, 'conv', 'weights')
  File "yolov3_to_onnx.py", line 351, in _create_param_tensors
    conv_params, param_category, suffix)
  File "yolov3_to_onnx.py", line 383, in _load_one_param_type
    buffer=self.weights_file.read(param_size * 4))
TypeError: buffer is too small for requested array
```
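"buffer is too small for requested array" generally means the converter asked `numpy.frombuffer` for more bytes than remained in the `.weights` file, i.e. its running tally of per-layer parameter counts diverged from what darknet actually wrote. A likely culprit (an assumption, since the csmobilenetv2 cfg is not shown here) is that the sample parser sizes every convolution as filters × channels × size × size and ignores the `groups=` key, while MobileNetV2-style depthwise convolutions store far fewer weights. A minimal sketch of the arithmetic:

```python
# Sketch of darknet conv weight sizing (illustrative, not the
# converter's actual code). A parser that ignores `groups` computes
# the standard-conv count even for depthwise layers, so it reads
# past the end of the .weights file.

def conv_weight_count(filters, in_channels, size, groups=1):
    # Darknet stores filters * (in_channels / groups) * size * size
    # floats for a convolutional layer's kernel.
    return filters * (in_channels // groups) * size * size

standard = conv_weight_count(32, 32, 3)              # 32*32*3*3 = 9216 floats
depthwise = conv_weight_count(32, 32, 3, groups=32)  # 32*1*3*3  = 288 floats
print(standard, depthwise)
```

With a 32-channel 3x3 layer the gap is already 9216 vs 288 floats, so the read offset drifts on the first depthwise layer and eventually overruns the file.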

ngocneo commented 3 years ago

I have the same problem. Do you have any solution?
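One way to confirm this kind of mismatch before converting is to compare the parameter count implied by the `.cfg` against the size of the `.weights` file. The sketch below is a simplified assumption, not code from the repository: it only tracks plain `[convolutional]` sections, hard-codes a 5-int32 darknet header, and omits the channel bookkeeping that `[route]`/`[shortcut]` layers would need; all file names are hypothetical.

```python
# Hypothetical pre-flight check: does the float count implied by a
# darknet .cfg match what the .weights file actually stores?
import os

def parse_cfg(path):
    # Minimal darknet cfg parser: a list of {'type': ..., key: value} dicts.
    sections, current = [], None
    with open(path) as f:
        for line in f:
            line = line.split('#')[0].strip()
            if not line:
                continue
            if line.startswith('['):
                current = {'type': line.strip('[]')}
                sections.append(current)
            elif current is not None:
                key, _, val = line.partition('=')
                current[key.strip()] = val.strip()
    return sections

def expected_float_count(sections, in_channels=3):
    total, channels = 0, in_channels
    for s in sections:
        if s['type'] != 'convolutional':
            continue
        filters = int(s.get('filters', 1))
        size = int(s.get('size', 1))
        groups = int(s.get('groups', 1))        # depthwise layers set this > 1
        bn = int(s.get('batch_normalize', 0))
        total += filters * (channels // groups) * size * size  # kernel weights
        total += filters * (4 if bn else 1)     # biases (+ bn scales/means/vars)
        channels = filters
    return total

def check(cfg_path, weights_path, header_bytes=5 * 4):
    # Each stored parameter is a 4-byte float after the header.
    stored = (os.path.getsize(weights_path) - header_bytes) // 4
    return expected_float_count(parse_cfg(cfg_path)), stored
```

If the two numbers disagree, the converter's layer model does not match what darknet wrote, and the `frombuffer` overrun above is the symptom you would expect.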