Closed WaterKnight1998 closed 1 year ago
Please read the README in that folder: https://github.com/triton-inference-server/server/tree/main/docs/examples. There are tutorials to follow to make use of this folder.
If you'd like to try loading them without the tutorials, you just need to run the fetch_models.sh script in that folder to get the necessary files for the densenet_onnx and inception_graphdef models. They aren't meant to be run standalone, so I'd highly recommend following the tutorial.
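As a rough sketch of that flow (paths assume you have cloned the triton-inference-server/server repository and have Docker with GPU support available; adjust to your checkout location):

```shell
# Fetch the model files referenced by the example repository
# (fetch_models.sh lives in docs/examples and populates model_repository)
cd server/docs/examples
./fetch_models.sh

# Then start Triton with that repository mounted, as in the original command
docker run --gpus=1 --rm --net=host \
  -v ${PWD}/model_repository:/models \
  nvcr.io/nvidia/tritonserver:23.05-py3 \
  tritonserver --model-repository=/models
```

Without the fetch step, the densenet_onnx and inception_graphdef directories contain only configuration, not the model weights, which is why loading fails out of the box.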
Description
Default models from /docs/examples/model_repository are not working.

Triton Information
What version of Triton are you using? 2.34.0

Are you using the Triton container or did you build it yourself? Triton container: nvcr.io/nvidia/tritonserver:23.05-py3
To Reproduce
docker run --gpus=1 --rm --net=host -v ${PWD}/docs/examples/model_repository:/models nvcr.io/nvidia/tritonserver:23.05-py3 tritonserver --model-repository=/models
Expected behavior
All default models should work.