Hi there, I'm having a problem trying to run inference with a model downloaded from the hub. I was able to run the nuclei segmentation model, but after I installed TensorRT I can no longer run inference with any model. The error message is: "Error parsing pipeline! Failed to build CUDA engine for TensorRT." I verified the TensorRT installation as suggested in the NVIDIA installation guide and it seems to be working fine.
I'm working with Ubuntu 20.04, cuDNN 8.8, TensorRT 8.5.3.1, CUDA 11.7, and a GeForce GTX 1660 Ti Mobile.
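For context, the kind of quick sanity check I mean is roughly the sketch below (this uses the standard TensorRT Python bindings, not the exact verification steps from the NVIDIA guide), and that sort of check runs fine on my machine:

```python
import tensorrt as trt

# Report the installed TensorRT version.
print("TensorRT version:", trt.__version__)

# Create a logger and builder to confirm the TensorRT runtime libraries load.
logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)

# Create an (empty) explicit-batch network as a minimal smoke test.
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
print("Builder and network created successfully")
```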