triton-inference-server / server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.
https://docs.nvidia.com/deeplearning/triton-inference-server/user-guide/docs/index.html
BSD 3-Clause "New" or "Revised" License

error: creating server: Internal - s3:// file-system not supported. To enable, build with -DTRITON_ENABLE_S3=ON. #7582

shahizat commented 2 weeks ago

Hello,

When I pass an S3 path, I receive the following error:

error: creating server: Internal - s3:// file-system not supported. To enable, build with -DTRITON_ENABLE_S3=ON

The pod is based on the nvcr.io/nvidia/tritonserver:24.07-py3-igpu image and runs on an NVIDIA Jetson AGX Orin Developer Kit. If possible, please point me to an image that has S3 support enabled.
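
For reference, the S3 path is passed along these lines (bucket name, region, and credential values are placeholders, not my actual setup):

# Triton reads S3 credentials from the standard AWS environment variables.
export AWS_ACCESS_KEY_ID=<access-key>
export AWS_SECRET_ACCESS_KEY=<secret-key>
export AWS_DEFAULT_REGION=us-east-1
# The model repository is then given as an s3:// URL.
tritonserver --model-repository=s3://my-bucket/model_repository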

Thank you in advance for your help.

shahizat commented 1 week ago

Hello NVIDIA team, if possible, could you please suggest the correct build command for enabling S3 support? I also noticed that the NCCL package is not included; to install it, I have to run "sudo apt install libnccl2 libnccl-dev" inside the container (a sketch for baking this into a derived image follows below).
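
One way to avoid reinstalling NCCL on every pod restart is to bake it into a derived image; a minimal sketch, assuming the 24.07-py3-igpu base, with the derived tag as a placeholder:

# Install NCCL in a container started from the base image, then snapshot it.
docker run --name triton-nccl nvcr.io/nvidia/tritonserver:24.07-py3-igpu \
    bash -c "apt-get update && apt-get install -y libnccl2 libnccl-dev"
docker commit triton-nccl tritonserver-nccl:24.07-py3-igpu
docker rm triton-nccl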

I was using the following command:

sudo python3 build.py \
    --build-parallel 10 \
    --no-force-clone \
    --target-platform igpu \
    --target-machine aarch64 \
    --filesystem s3 \
    --enable-gpu \
    --enable-mali-gpu \
    --enable-metrics \
    --enable-logging \
    --enable-stats \
    --enable-cpu-metrics \
    --enable-nvtx \
    --backend onnxruntime \
    --backend pytorch \
    --backend tensorflow \
    --backend python \
    --backend tensorrt \
    --endpoint http \
    --endpoint grpc \
    --min-compute-capability "5.3" \
    --image "base,nvcr.io/nvidia/${IMAGE_NAME}:${OFFICIAL_MIN_IMAGE_TAG}" \
    --image "gpu-base,nvcr.io/nvidia/${IMAGE_NAME}:${OFFICIAL_MIN_IMAGE_TAG}