triton-inference-server / server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.
https://docs.nvidia.com/deeplearning/triton-inference-server/user-guide/docs/index.html
BSD 3-Clause "New" or "Revised" License

Python backend cannot import Tensor #4870

Open Phelan164 opened 2 years ago

Phelan164 commented 2 years ago

Description
In a Python backend model, importing Tensor with from triton_python_backend_utils import Tensor fails with:

UNAVAILABLE: Internal: ImportError: cannot import name 'Tensor' from 'triton_python_backend_utils' (/opt/tritonserver/backends/python/triton_python_backend_utils.py)

The same import still works on Triton Inference Server 22.05.

Importing it from c_python_backend_utils works, however.

Triton Information Triton inference server version 22.07

To Reproduce Steps to reproduce the behavior.

Describe the models (framework, inputs, outputs), ideally include the model configuration file (if using an ensemble include the model configuration file for that as well).

Expected behavior A clear and concise description of what you expected to happen.

kthui commented 2 years ago

Hi @Tabrizian, do you know if the from triton_python_backend_utils import Tensor import is supported in the Python backend?

Phelan164 commented 2 years ago

@kthui @Tabrizian any update on this issue?

Tabrizian commented 1 year ago

@Phelan164 Sorry for the delay in responding to this thread. Importing Tensor from triton_python_backend_utils is not supported before the initialize function is called. We can fix this issue in Python models. I'll file a ticket for it.
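Based on the maintainer's explanation, a possible workaround is to resolve the Tensor class lazily (inside initialize or execute, after the module is fully set up) rather than with a top-level from ... import Tensor, falling back to c_python_backend_utils as reported above. This is a hypothetical sketch, not an official API; the helper name get_tensor_class is an assumption introduced here.

```python
# Hypothetical workaround sketch (not official Triton API): look up the
# Tensor class lazily instead of importing it at module load time, and
# fall back to the c_python_backend_utils extension module mentioned in
# this thread.
import importlib


def get_tensor_class():
    """Return the Triton Tensor class from whichever module provides it,
    or None when running outside a Triton Python-backend environment."""
    for name in ("triton_python_backend_utils", "c_python_backend_utils"):
        try:
            mod = importlib.import_module(name)
        except ImportError:
            continue
        if hasattr(mod, "Tensor"):
            return mod.Tensor
    return None
```

A model's execute function could then call Tensor = get_tensor_class() at request time, by which point initialize has run and the attribute is available.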

Phelan164 commented 1 year ago

@Tabrizian thanks for the information.

SamSamhuns commented 1 year ago

Hi, has this issue been fixed? I still cannot run from triton_python_backend_utils import Tensor in nvcr.io/nvidia/tritonserver:22.12-py3.

dyastremsky commented 7 months ago

Related issue: https://github.com/triton-inference-server/server/issues/6535

Ref: 4234