cyrusbehr / tensorrt-cpp-api

TensorRT C++ API Tutorial

support cuda 11.8 #46

Closed: sammilei closed this issue 8 months ago

sammilei commented 8 months ago

Thanks for the code! It was able to run with CUDA Toolkit 12.1, a driver reporting CUDA 11.8, and TensorRT-8.6.1.6 for CUDA 12.1. However, I need to run the TensorRT inference on CUDA Toolkit 11.8. What I have tried:

  1. With TensorRT-8.6.1.6 for CUDA 11.8: cmake and make passed, but when I ran the program I got
    libcublas.so.12 => not found
    libcublasLt.so.12 => not found

    I tried symlinking the xxxx.so.11 libraries to .so.12, but the program detected the mismatch and complained (see the check sketched after this list):

    ldd run_inference_benchmark | grep found
    ./run_inference_benchmark: /lib/x86_64-linux-gnu/libcublas.so.12: version `libcublas.so.12' not found (required by /lib/x86_64-linux-gnu/libnvinfer_plugin.so.8)
    ./run_inference_benchmark: /lib/x86_64-linux-gnu/libcublasLt.so.12: version `libcublasLt.so.12' not found (required by /lib/x86_64-linux-gnu/libnvinfer_plugin.so.8)
  2. I saw your release 1.0 too, but that TensorRT version only supports CUDA up to 11.5.
  3. I also tried TensorRT-8.4.1.5, but the current code uses functions that don't exist in that TensorRT version.
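
For reference, a minimal check (using the library path from the ldd output above; the /usr/local/cuda/lib64 location is only an assumption about a default CUDA install) that shows which cuBLAS major version the installed TensorRT libraries themselves expect, independent of this project's code:

    # Which cuBLAS major version does the installed TensorRT plugin library require?
    ldd /lib/x86_64-linux-gnu/libnvinfer_plugin.so.8 | grep -i cublas

    # Which cuBLAS libraries are actually present on the system?
    ls /lib/x86_64-linux-gnu/ /usr/local/cuda/lib64/ 2>/dev/null | grep -i cublas

If the first command reports libcublas.so.12 while only libcublas.so.11 is installed, the TensorRT build itself was compiled against CUDA 12.x.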

Any suggestions are appreciated!

cyrusbehr commented 8 months ago

I use TensorRT 8.6 and CUDA 11.8 in my environment and it works fine. Make sure you download the correct version of TensorRT: you likely downloaded the build for CUDA 12.x, whereas you need the build for CUDA 11.x.

See my answer in the other issue: https://github.com/cyrusbehr/tensorrt-cpp-api/issues/45
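
A quick way to double-check which build you have (the file names below only illustrate the naming pattern of the TensorRT 8.6 tarballs, and the extract directory is a placeholder):

    # The target CUDA major version is encoded in the tarball name, e.g.:
    #   TensorRT-8.6.1.6.Linux.x86_64-gnu.cuda-11.8.tar.gz   <- for CUDA 11.x
    #   TensorRT-8.6.1.6.Linux.x86_64-gnu.cuda-12.0.tar.gz   <- for CUDA 12.x

    # After extracting, the libraries shipped with the CUDA 11.x build should
    # resolve to libcublas.so.11 rather than libcublas.so.12:
    ldd <TensorRT-extract-dir>/lib/libnvinfer_plugin.so.8 | grep -i cublas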

sammilei commented 8 months ago

Thanks a lot for the quick reply! I didn't realize you had answered my question in a different thread; I thought the post had failed to submit. I have indeed reinstalled CUDA a few times. Maybe I should reinstall my Ubuntu 20.04 from scratch.
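
Before reinstalling the OS, I will first confirm which CUDA installs are actually on the machine and which one the linker picks up at run time (a sketch assuming the default /usr/local/cuda* layout):

    # List side-by-side CUDA toolkit installs
    ls -d /usr/local/cuda*

    # Toolkit version seen by the compiler vs. CUDA version reported by the driver
    nvcc --version
    nvidia-smi

    # Library search paths the dynamic linker will use
    echo "$LD_LIBRARY_PATH"
    cat /etc/ld.so.conf.d/*cuda* 2>/dev/null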