DimaBir / ResNetTensorRT

Optimized Inference with ResNet-50: A demonstration of inference performance using PyTorch, TensorRT, ONNX, and OpenVINO. Includes benchmarks, predictions, and model exporters.
GNU General Public License v3.0

ONNX Inference with CUDA Runtime #9

Open deo-abhijit opened 5 days ago

deo-abhijit commented 5 days ago

Hey, if you are open to this feature, I can add an ONNX inference benchmark with the CUDA execution provider.

DimaBir commented 5 days ago

Hey, if you are open to this feature, I can add an ONNX inference benchmark with the CUDA execution provider.

Sure, I appreciate any contribution that can help others.
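
The proposed benchmark could look roughly like the sketch below: ONNX Runtime is given a provider list that prefers `CUDAExecutionProvider` and falls back to CPU, then the model is timed over repeated runs. The function name, model path, and the `1x3x224x224` input shape are assumptions for a ResNet-50 export, not the repo's actual API.

```python
import time

# Provider preference: try the CUDA execution provider first, fall back to CPU.
ONNX_PROVIDERS = ["CUDAExecutionProvider", "CPUExecutionProvider"]

def benchmark_onnx(model_path, runs=100, warmup=10):
    """Return the average latency in ms for one inference of a ResNet-style model."""
    import numpy as np
    import onnxruntime as ort  # CUDA support requires the onnxruntime-gpu package

    session = ort.InferenceSession(model_path, providers=ONNX_PROVIDERS)
    input_name = session.get_inputs()[0].name
    x = np.random.rand(1, 3, 224, 224).astype(np.float32)  # assumed ResNet-50 input shape

    # Warm-up iterations are excluded from the timing.
    for _ in range(warmup):
        session.run(None, {input_name: x})

    start = time.perf_counter()
    for _ in range(runs):
        session.run(None, {input_name: x})
    return (time.perf_counter() - start) / runs * 1000.0
```

If CUDA is unavailable at session creation, ONNX Runtime silently falls back to the next provider in the list, so the same code path can serve as both the GPU and CPU benchmark.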

deo-abhijit commented 2 days ago

Hi @DimaBir, could you please check whether the GPU Docker image is working? I tried the Docker image by running python main.py but got the following error. Things work fine on my local machine, but inside the Docker image I was unable to run main.py.

OSError: libnvinfer_plugin.so.10: cannot open shared object file: No such file or directory
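
That `OSError` means the dynamic loader cannot find the TensorRT plugin library inside the container, typically because the TensorRT runtime isn't installed in the image or its directory isn't on `LD_LIBRARY_PATH`. A minimal check for this, using only the standard library (the helper name is hypothetical):

```python
import ctypes

def can_load(soname):
    """Return True if the shared library can be dlopen'ed on this system."""
    try:
        ctypes.CDLL(soname)
        return True
    except OSError:
        return False

# Inside the GPU container this should be True once the TensorRT runtime
# libraries are installed and visible on LD_LIBRARY_PATH.
print(can_load("libnvinfer_plugin.so.10"))
```

If it prints `False`, likely fixes are rebuilding the image on a base that ships TensorRT 10 (the `.so.10` suffix suggests the code expects that major version) or extending `LD_LIBRARY_PATH` to include the TensorRT install directory.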