triton-inference-server / tensorrtllm_backend

The Triton TensorRT-LLM Backend
Apache License 2.0

Failed install in nvcr.io/nvidia/tritonserver:24.08-trtllm-python-py3 #623

Open wwx007121 opened 1 month ago

wwx007121 commented 1 month ago

System Info

NVIDIA-SMI 535.104.12 Driver Version: 535.104.12 CUDA Version: 12.5 base docker image: tritonserver:24.08-trtllm-python-py3

Who can help?

I want to install torchaudio in tritonserver:24.08-trtllm-python-py3, but there is a version conflict between torchaudio and the torch build that ships with the container.
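For reference, the torch build that ships in the container can be checked before installing anything. The NGC image bundles an NVIDIA build of torch, so prebuilt torchaudio wheels from PyPI may have been compiled against a different libtorch ABI; this is my reading of the errors below, not a confirmed root cause. A minimal check:

python3 -c "import torch; print(torch.__version__, torch.version.cuda)"
pip list | grep -E "^(torch|tensorrt)"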

Information

Tasks

Reproduction

I tried two approaches; both failed.

Attempt 1: pip install torchaudio
from tensorrt_llm.bindings.BuildInfo import ENABLE_MULTI_DEVICE
ImportError: /usr/local/lib/python3.10/dist-packages/tensorrt_llm/bindings.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c1021throwNullDataPtrErrorEv

Attempt 2: pip install torchaudio==2.4.0
import torchaudio
OSError: /usr/local/lib/python3.10/dist-packages/torchaudio/lib/libtorchaudio.so: undefined symbol: _ZNK5torch8autograd4Node4nameEv

How can this be solved?
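One workaround that may be worth trying (a sketch, assuming the container ships a torch 2.4.x build and that the failures come from torchaudio wheels being compiled against the stock PyPI torch): build torchaudio from source against the torch already installed in the image, and keep pip from replacing that torch.

# build tools may be needed for the source build
pip install cmake ninja

# --no-build-isolation builds against the torch already in the environment;
# --no-deps stops pip from pulling a PyPI torch that would break tensorrt_llm
pip install --no-build-isolation --no-deps "git+https://github.com/pytorch/audio.git@v2.4.0"

The v2.4.0 tag is an assumption; it should match the minor version printed by the torch version check above.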

Expected behavior

torchaudio installs and imports successfully alongside the torch and tensorrt_llm packages already present in the container.

actual behavior

Both installation attempts fail with the undefined-symbol errors shown above.

additional notes

..