Closed: YoushaaMurhij closed this issue 3 years ago
Great work! Have you tested this model on Jetson devices like the TX2 or Xavier? Or do you have any plans to implement it in the future?
Yep, I tested it on Xavier. As long as the environment is consistent (TensorRT 7.1.0+, CUDA 10.2, onnx2trt), running this repo is basically no problem. But note that there are still many imperfections you will need to improve yourself.
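For reference, a minimal sketch of that conversion step on Xavier (the ONNX filenames below, `cbgs_pp_multihead_pfe.onnx` and `cbgs_pp_multihead_backbone.onnx`, are assumptions based on the repo's two-stage export; adjust names and flags to your own setup):

```bash
# Sanity-check the environment (TensorRT 7.1.x, CUDA 10.2 on JetPack)
dpkg -l | grep -E "nvinfer|cuda-toolkit"

# Convert the exported ONNX models to TensorRT engines with onnx2trt
# (-o: output engine file, -b: max batch size, -d 16: build an FP16 engine)
onnx2trt cbgs_pp_multihead_pfe.onnx      -o cbgs_pp_multihead_pfe.trt      -b 1 -d 16
onnx2trt cbgs_pp_multihead_backbone.onnx -o cbgs_pp_multihead_backbone.trt -b 1 -d 16
```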
@hova88 @YoushaaMurhij Hi. How much inference time does PointPillars_MultiHead_40FPS take with the TRT model on Xavier?