triton-inference-server / server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.
https://docs.nvidia.com/deeplearning/triton-inference-server/user-guide/docs/index.html
BSD 3-Clause "New" or "Revised" License

Triton Inference Server installation failure #6417

Open zhenxinxu opened 9 months ago

zhenxinxu commented 9 months ago

I used the nvcr.io/nvidia/tritonserver:23.09-py3-min image to compile and install Triton. The build command was ./build.py -v --no-container-build --enable-logging --build-dir=`pwd`/build --enable-all --no-container-source, but it failed with the error shown in the attached screenshot. How can I get the build to succeed?
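For context, a minimal sketch of reproducing this setup with the full build output captured to a file (the mount path and the build.log file name are illustrative, not from the original report):

```bash
# Launch the 23.09 "min" base image with the Triton source tree mounted,
# then run the reported build command and tee all output (stdout + stderr)
# into build.log so the real failure is not lost in the terminal scrollback.
docker run --gpus all --rm -v "$(pwd)":/workspace/server -w /workspace/server \
    nvcr.io/nvidia/tritonserver:23.09-py3-min \
    bash -c './build.py -v --no-container-build --enable-logging \
                 --build-dir=$(pwd)/build --enable-all --no-container-source \
                 2>&1 | tee build.log'
```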

tanmayv25 commented 9 months ago

I think the error has not been captured here. Can you take a look at the logs for what failed?
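If the output was captured to a file (e.g. the hypothetical build.log from the sketch above), the first real failure can usually be located like this:

```bash
# Print the first occurrences of common failure markers, with line numbers
# so it is easy to jump to the surrounding context in the log.
grep -n -iE "error|fatal|failed" build.log | head -n 20

# CMake also writes its own logs under the build directory (if present);
# these often contain the root cause of a configure-time failure.
find build -name CMakeError.log -o -name CMakeOutput.log
```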

zhenxinxu commented 9 months ago

[screenshot attached]

zhenxinxu commented 9 months ago

I got the error shown in the attached screenshot.

zhenxinxu commented 9 months ago

> I think the error has not been captured here. Can you take a look at the logs for what failed?

[screenshot attached]