batrlatom opened this issue 3 years ago
Are you using TensorRT to deploy on a Jetson Xavier NX?
Normally yes, but not yet for your model. I do not know of any instance segmentation network that can be reasonably exported to TensorRT (Mask R-CNN can, but the speedup is not huge). Semantic segmentation networks like FCN and DeepLab work quite well. This is more of a feature request, if you would be able to take a look at it. Usually it can be done via PyTorch -> ONNX -> TensorRT. I saw that a few people asked about ONNX export; that is the first step towards TensorRT.
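For reference, the first step of that pipeline (PyTorch -> ONNX) looks roughly like the sketch below. This is not a SOLOv2 export; it uses torchvision's Mask R-CNN (mentioned above as exportable) as a stand-in, and the input resolution and file name are assumptions. SOLOv2 itself would need a wrapper around its forward() that returns plain tensors before this would work.

```python
# Rough sketch of the PyTorch -> ONNX step, using torchvision's Mask R-CNN as a
# stand-in for an exportable instance segmentation model (assumption, not the
# SOLOv2 export itself).
import torch
import torchvision

model = torchvision.models.detection.maskrcnn_resnet50_fpn(pretrained=True)
model.eval()

# Dummy input; the resolution is an assumption, adjust to your deployment setting.
x = [torch.rand(3, 800, 800)]

# torchvision detection models export with opset >= 11.
torch.onnx.export(model, x, "mask_rcnn.onnx", opset_version=11)
```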
TensorRT could roughly triple the model's inference speed. That would help make it usable on devices like the Jetson Nano and Jetson Xavier NX.
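Once an ONNX file exists, a TensorRT engine can be built on the Jetson either with the bundled `trtexec` tool (e.g. `trtexec --onnx=mask_rcnn.onnx --saveEngine=mask_rcnn.engine --fp16`) or through the Python API. Below is a rough sketch with the TensorRT 8.x Python bindings; the file names and the FP16 flag are assumptions for a Jetson-style deployment, not a verified SOLOv2 conversion.

```python
# Rough sketch: build a TensorRT engine from an ONNX file with the TensorRT 8.x
# Python API. Paths and FP16 are assumptions.
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
parser = trt.OnnxParser(network, logger)

with open("mask_rcnn.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("ONNX parse failed")

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.FP16)  # FP16 helps on Jetson GPUs

serialized_engine = builder.build_serialized_network(network, config)
with open("mask_rcnn.engine", "wb") as f:
    f.write(serialized_engine)
```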
Hello, did you manage to export SOLOv2 to TensorRT successfully? I failed!
@zzzz737 Someone successfully converted a SOLOv2-like instance segmentation model to TensorRT:
https://zhuanlan.zhihu.com/p/341813709
You could ask them; maybe they can also convert SOLOv2.
See my repository; it may be helpful.