triton-inference-server / server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.
https://docs.nvidia.com/deeplearning/triton-inference-server/user-guide/docs/index.html
BSD 3-Clause "New" or "Revised" License

python backend: how does the conda environment support multiple versions #3961

Open zhaohb opened 2 years ago

zhaohb commented 2 years ago

Description

.
├── 1
├── config.pbtxt
└── test_env.tar.gz

1 directory, 2 files

The directory structure supported by the Python backend is shown above. I am not sure what to do when a model has multiple versions and each version needs a different conda environment. Would it be possible to put test_env.tar.gz in the model version directory? I tried that, like this:

.
├── 1
│   ├── test_env.tar.gz
│   └── model.py
└── config.pbtxt

1 directory, 3 files

But when I put the conda environment in the version directory, I don't know how to reference it with a relative path, because I cannot determine the corresponding version. How can I solve this?
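For reference, the supported per-model layout at the top of this issue works by pointing `config.pbtxt` at the tarball via the `EXECUTION_ENV_PATH` parameter documented in the python_backend README; a sketch for the model above (the tarball name `test_env.tar.gz` is from this issue, everything else follows the documented convention):

```
parameters: {
  key: "EXECUTION_ENV_PATH",
  value: {string_value: "$$TRITON_MODEL_DIRECTORY/test_env.tar.gz"}
}
```

`$$TRITON_MODEL_DIRECTORY` resolves to the model's directory (the one containing `config.pbtxt`), which is why the tarball is expected at the model level rather than inside a version directory.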

Triton Information: 22.01

Are you using the Triton container or did you build it yourself? no
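As a side note on "I can't get the corresponding version": inside `model.py`, the `args` dict passed to `TritonPythonModel.initialize()` includes `model_repository` and `model_version`, so the version directory itself can be located at runtime. A minimal sketch (the helper name `version_directory` is hypothetical; note this does not help with the conda environment, since Triton extracts the execution environment before `model.py` is loaded):

```python
import os


def version_directory(args):
    """Build the model-version directory from the args dict that
    Triton passes to TritonPythonModel.initialize().

    args["model_repository"] is the model's own directory (the one
    holding config.pbtxt); args["model_version"] is a string like "1".
    """
    return os.path.join(args["model_repository"], args["model_version"])


class TritonPythonModel:
    def initialize(self, args):
        # Hypothetical usage: locate per-version files next to model.py.
        self.version_dir = version_directory(args)
```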

Tabrizian commented 2 years ago

It is not possible to do this. Execution environments are per model, not per model version. You need to create a separate model in order to host different execution environments. I have marked this feature request as an enhancement.
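The workaround described above can be sketched as a repository layout with one model per environment (model and tarball names here are hypothetical, chosen only to illustrate the one-environment-per-model rule):

```
models/
├── test_env_a              # one model per execution environment
│   ├── 1
│   │   └── model.py
│   ├── config.pbtxt        # EXECUTION_ENV_PATH -> env_a.tar.gz
│   └── env_a.tar.gz
└── test_env_b
    ├── 1
    │   └── model.py
    ├── config.pbtxt        # EXECUTION_ENV_PATH -> env_b.tar.gz
    └── env_b.tar.gz
```

Each model then pins exactly one conda environment, at the cost of duplicating the model entry for every environment you need.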