triton-inference-server / server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.
https://docs.nvidia.com/deeplearning/triton-inference-server/user-guide/docs/index.html
BSD 3-Clause "New" or "Revised" License

gRPC version mismatch between the Triton client and DeepStream #5918

Closed beomsik-park98 closed 1 year ago

beomsik-park98 commented 1 year ago

Hi,

My application utilizes both the Triton client library and DeepStream for their respective purposes. However, there is a discrepancy in the gRPC versions they employ, with one using v1.48.0 and the other using v1.38.0. Consequently, this causes an error during the compilation process.

Is there an effective approach to integrating these two libraries within a single program?

Note: I'm using the C++ Triton client library.

nv-kmcgill53 commented 1 year ago

CC: @jbkyang-nvi

jbkyang-nvi commented 1 year ago

Can you update the DeepStream library? If not, you can fork the third_party repo and change these lines in the client CMakeLists to point to your forked branch: https://github.com/triton-inference-server/client/blob/main/CMakeLists.txt#L66-L69
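The referenced lines in the client CMakeLists pin the `third_party` repository that supplies gRPC. A rough sketch of the kind of change being suggested, assuming the file fetches the repo by URL and tag (the actual file may use a different mechanism, so check it before editing; the fork URL and branch name below are placeholders):

```cmake
# Hypothetical sketch: point the third_party dependency at your fork,
# in which the gRPC version has been changed to match DeepStream's v1.38.0.
FetchContent_Declare(
  repo-third-party
  GIT_REPOSITORY https://github.com/<your-user>/third_party.git  # placeholder: your fork
  GIT_TAG <your-branch-with-grpc-1.38.0>                         # placeholder: branch pinning gRPC v1.38.0
)
```

Inside the forked third_party repo you would then change the gRPC version it builds, so that both the Triton client and DeepStream link against the same gRPC.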

beomsik-park98 commented 1 year ago

Thank you. I'll try it.