Description
Hi, I'm trying to use conda-packs inside my config.pbtxt for Python backend models. When I add them, the Python models load successfully, but the model with backend: "onnxruntime" reports the following error:
E0815 11:26:46.113871 1624 model_lifecycle.cc:626] failed to load 'embedder' version 1: Internal: failed to stat file server/embedder/1/model.onnx
Without the conda-packs in config.pbtxt, Triton Server and the conda environment work without any problems. I tried adding the conda environment to the ONNX Runtime model's configuration as well, but it didn't change anything.
The path to the model is correct. I also tried using the default_model_filename option, but it didn't help.
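For reference, the conda-pack is wired into each Python backend model roughly like this (a minimal sketch; the model name and archive name are placeholders, using the Python backend's EXECUTION_ENV_PATH parameter):

```
name: "my_python_model"
backend: "python"

parameters: {
  key: "EXECUTION_ENV_PATH"
  value: { string_value: "$$TRITON_MODEL_DIRECTORY/python_env.tar.gz" }
}
```

The archive here is assumed to be one produced with conda-pack, e.g. `conda pack -n my_env -o python_env.tar.gz`, and placed in the model directory.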
Triton Information
I'm currently using 23.07; I also tried 24.07, and it didn't change anything.
Are you using the Triton container or did you build it yourself?
Default container, plus a conda environment with the libraries for the Python backend models.
To Reproduce
Here is my config:
Here are the logs:
Expected behavior
Using conda-packs in the Python backend shouldn't affect ONNX Runtime models.