triton-inference-server / server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.
https://docs.nvidia.com/deeplearning/triton-inference-server/user-guide/docs/index.html
BSD 3-Clause "New" or "Revised" License

How to call a custom Python node's execute() and initialize() methods from other Python code? #6603

Open denti opened 1 year ago

denti commented 1 year ago

Hi team! I want to test my custom Python module (model.py) in plain Python before sending it to the Triton server. So I have a Python interpreter and a stand-alone file test.py where I want to do something like this:

import numpy as np
import triton_python_backend_utils as pb_utils

from model import TritonPythonModel

test_tensor_for_my_backend = pb_utils.Tensor("input.1", np.array(tmp))
test_inference_request = pb_utils.InferenceRequest(test_tensor_for_my_backend)

backend_to_test = TritonPythonModel()
backend_to_test.initialize({"some": "args"})
backend_to_test.execute(test_inference_request)

When I try to run this simple Python code inside the Triton server container (from NGC), importing triton_python_backend_utils from /opt/tritonserver/backends/python, I get an error: AttributeError: module 'triton_python_backend_utils' has no attribute 'Tensor'

So what is the right way of doing this?

kthui commented 12 months ago

Hi @denti, did you manage to "install" triton_python_backend_utils so that it is resolvable outside the Python backend environment? This looks similar to the ask in https://github.com/triton-inference-server/server/issues/5961.

denti commented 12 months ago

Hi @kthui, thanks a lot for the info! No, I couldn't "install" triton_python_backend_utils, and my issue is the same as the one you attached, #5961. Could you please advise on how to install triton_python_backend_utils in a proper way inside the Triton server container, so that I can interact with pb_utils.Tensor and pb_utils.InferenceRequest in Python tests?

kthui commented 12 months ago

Thanks for the update. It would take some effort to install triton_python_backend_utils outside Triton; it is likely easier to simply run the code on the Python backend with Triton. You could download a pre-built Triton container from NGC and test everything locally. I have added this issue to our ticket on enhancing the Python backend developer experience.
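For the local-container route, an invocation might look like the following sketch (the host model-repository path and the image tag are illustrative placeholders, not values from this thread):

```shell
# Serve a local model repository with a pre-built Triton image from NGC;
# model.py then runs under the real Python backend, where pb_utils.Tensor
# and pb_utils.InferenceRequest exist.
docker run --rm --gpus all \
  -p 8000:8000 -p 8001:8001 -p 8002:8002 \
  -v /path/to/model_repository:/models \
  nvcr.io/nvidia/tritonserver:23.10-py3 \
  tritonserver --model-repository=/models
```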

cc @Tabrizian in case you know a simple way of mocking the functionality of triton_python_backend_utils outside Triton.
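One hand-rolled option is to register a stub module under the name triton_python_backend_utils before importing model.py. This is only a sketch: the class and function names below mirror the real pb_utils API, but every implementation detail is an assumption made for unit testing, not the real backend behavior.

```python
# Minimal stand-in for triton_python_backend_utils, for testing
# model.py outside Triton. Names mirror the real pb_utils API; the
# implementations are simplified assumptions for unit tests only.
import sys
import types


class Tensor:
    """Mock of pb_utils.Tensor: a named data buffer."""

    def __init__(self, name, data):
        self._name = name
        self._data = data

    def name(self):
        return self._name

    def as_numpy(self):
        # The real API returns a numpy array; here we simply return
        # whatever data was passed in (e.g. an np.array).
        return self._data


class InferenceRequest:
    """Mock of pb_utils.InferenceRequest holding a list of input Tensors."""

    def __init__(self, inputs=None, model_name="", requested_output_names=None):
        self._inputs = list(inputs or [])

    def inputs(self):
        return self._inputs


class InferenceResponse:
    """Mock of pb_utils.InferenceResponse holding output Tensors."""

    def __init__(self, output_tensors=None, error=None):
        self._output_tensors = list(output_tensors or [])
        self._error = error

    def output_tensors(self):
        return self._output_tensors


def get_input_tensor_by_name(request, name):
    """Mirror of the pb_utils helper: find an input Tensor by its name."""
    for tensor in request.inputs():
        if tensor.name() == name:
            return tensor
    return None


# Register the stub so that `import triton_python_backend_utils`
# inside model.py resolves to this mock instead of failing.
mock = types.ModuleType("triton_python_backend_utils")
mock.Tensor = Tensor
mock.InferenceRequest = InferenceRequest
mock.InferenceResponse = InferenceResponse
mock.get_input_tensor_by_name = get_input_tensor_by_name
sys.modules["triton_python_backend_utils"] = mock
```

With the stub registered, `from model import TritonPythonModel` and the test snippet from the question can run in a plain interpreter; anything model.py uses that the stub does not define would still need to be added by hand.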

denti commented 11 months ago

Hi @Tabrizian, could you please tell me how I can create instances of these classes, pb_utils.Tensor and pb_utils.InferenceRequest, from inside the container nvcr.io/nvidia/tritonserver:23.10-py3? If I just try to import them in Python, I get the error AttributeError: module 'triton_python_backend_utils' has no attribute 'Tensor'.