ONNX does not support this operator, but that does not affect TensorRT conversion or inference.
Thank you. My supervisor just told me that I need to deploy BEVFormer on different kinds of embedded development boards, so I think I need to convert the .pth to ONNX myself. Still, thank you for your excellent work, even though I can't use it. (By the way, could you please tell me where I can find the network structure file for BEVFormer? The official checkpoints are saved as a dict, and I can't convert them to ONNX.)
Maybe you can create the model and load the dict file with load_state_dict, then save it with pickle.
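A minimal sketch of that suggestion, assuming `model` has already been constructed from the BEVFormer config (the checkpoint path below is a placeholder):

```python
import torch

# The official checkpoint is a dict; the weights usually live under the 'state_dict' key.
checkpoint = torch.load('bevformer_small.pth', map_location='cpu')  # placeholder path
state_dict = checkpoint.get('state_dict', checkpoint)
model.load_state_dict(state_dict)

# torch.save pickles the whole model object, not only the weights.
torch.save(model, 'bevformer_small_full.pth')
```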
Thanks, I know I need to use "model.load_state_dict()", but I just don't know where to import the model from or how to build it. (I thought the BEVFormer project saved the model structure as a file, so I could just import the model from that file, but I can't find it. Or should I build the whole model structure before using "model.load_state_dict()"?)
You can try to save the model in tool/bevformer/evaluate_pth.py before inference.
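The point being that the evaluation script already builds the model from the config and loads the checkpoint, so one extra line is enough. A hedged sketch of what that setup typically looks like in an MMDetection3D-style script (the paths and variable names are assumptions, not the actual code):

```python
import torch
from mmcv import Config
from mmcv.runner import load_checkpoint
from mmdet3d.models import build_model

# Typical evaluation setup: build the model from the config, then load the checkpoint.
cfg = Config.fromfile('projects/configs/bevformer/bevformer_small.py')  # placeholder path
model = build_model(cfg.model, test_cfg=cfg.get('test_cfg'))
load_checkpoint(model, 'ckpts/bevformer_small.pth', map_location='cpu')  # placeholder path

# Added line: dump the fully built model object before the inference loop starts.
torch.save(model, 'bevformer_small_full.pth')
```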
It works, thank you for helping me so much. The last thing is to use "torch.onnx.export" to start the conversion, but I need to pass input data with the correct shape and type. I tried to pass a tensor of shape (1, 6, 3, 736, 1280), but it says "TypeError: img_metas must be a list, but got <class 'torch.Tensor'>" (I used "bevformer_small.py" as the config file to build the model). Maybe I should use a tuple to pass the data. Could you please tell me the shape and type of the input data?
The shapes are in bevformer_small_trt.py. You can try sample/bevformer/small/pth2onnx.sh.
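For illustration, a hedged sketch of such an export call; the dummy shapes, input names, and extra inputs below are placeholders and should be taken from bevformer_small_trt.py and the pth2onnx.sh script rather than copied as-is:

```python
import torch

# Placeholder dummy inputs; the real shapes and names come from bevformer_small_trt.py.
dummy_img = torch.randn(1, 6, 3, 736, 1280)   # multi-camera image batch
dummy_prev_bev = torch.randn(2500, 1, 256)    # assumed BEV query shape
dummy_can_bus = torch.randn(18)               # assumed CAN-bus vector
dummy_lidar2img = torch.randn(1, 6, 4, 4)     # assumed camera projection matrices

model.eval()
torch.onnx.export(
    model,
    (dummy_img, dummy_prev_bev, dummy_can_bus, dummy_lidar2img),
    'bevformer_small.onnx',
    input_names=['image', 'prev_bev', 'can_bus', 'lidar2img'],
    output_names=['outputs'],
    opset_version=13,
)
```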
But I remember you said your ONNX model "can only be used as an intermediate model for the tensorrt engine", so can I use "bevformer_small.py" to build the model and use the shapes from "bevformer_small_trt.py"?
bevformer_small.py cannot be converted to ONNX, but bevformer_small_trt.py can. You can try to implement the unsupported ops of the ONNX file in ONNX Runtime, similar to the TensorRT plugins.
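As a sketch of that idea, ONNX Runtime can load a custom-op shared library at session creation time; libcustom_ops.so below is a hypothetical library you would have to implement and build yourself for the unsupported ops:

```python
import onnxruntime as ort

# Register a (hypothetical) custom-op library that implements the unsupported ops,
# then create the inference session for the exported model.
sess_options = ort.SessionOptions()
sess_options.register_custom_ops_library('libcustom_ops.so')

session = ort.InferenceSession(
    'bevformer_small.onnx',
    sess_options,
    providers=['CUDAExecutionProvider', 'CPUExecutionProvider'],
)
```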
Thanks, I'll try it later.
Sorry to bother you again. I finally evaluated the TRT model; now I want to save the results and visualize them. How can I do that?
Thank you. I mean I want to visualize the bbox results of the evaluation (draw the bboxes on the pictures). I tried to use "mmcv.dump(bbox_results['bbox_results'], '/root/autodl-tmp/BEVFormer/results_nusc.json')" to save the bboxes, but it says "list indices must be integers or slices, not str". How can I correctly get the bbox results from the model and draw them on the pictures?
Maybe you can use the nuScenes tool to parse it, but I am not familiar with it.
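Two hedged pointers, not from the thread: the TypeError suggests bbox_results is a list (one entry per sample), so it has to be indexed with integers before dumping; and once the results are converted to the nuScenes submission format, the nuscenes-devkit can render samples. The key names and paths below are assumptions:

```python
import mmcv
from nuscenes.nuscenes import NuScenes

# bbox_results is assumed to be a list with one dict per sample, e.g. [{'pts_bbox': ...}, ...].
mmcv.dump([sample['pts_bbox'] for sample in bbox_results],
          '/root/autodl-tmp/BEVFormer/results_nusc.json')

# The nuscenes-devkit can render a sample's camera image with its ground-truth boxes;
# rendering predictions requires converting the JSON to the official submission format first.
nusc = NuScenes(version='v1.0-mini', dataroot='/data/nuscenes', verbose=False)  # placeholder paths
sample = nusc.sample[0]
nusc.render_sample_data(sample['data']['CAM_FRONT'], with_anns=True)
```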
When I use "sh samples/bevformer/base/pth2onnx.sh -d ${gpu_id}", it warns "the shape inference of mmcv::MMCVModulatedDeformConv2d type is missing". How can I fix it? (It can still finish the conversion, but I guess the model structure will be affected.)