Open orderer0001 opened 1 year ago
@orderer0001 You can deploy a TensorRT engine on NVIDIA DeepStream or Triton Inference Server. I documented a deployment here: https://github.com/ChuRuaNh0/Ai_engineer/blob/main/Deploy/Triton-inference-server/docs/triton_tensorrt.md. Feel free to refer to it.
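For anyone looking for the general shape of a Triton deployment: Triton serves a serialized TensorRT engine (`.plan` file) from a model repository, with a `config.pbtxt` describing the inputs and outputs. Below is a minimal sketch; the model name `yolov5_trt` and the tensor names/shapes are hypothetical placeholders — replace them with whatever your exported engine actually uses.

```text
# Assumed model repository layout (directory names are illustrative):
#
# model_repository/
# └── yolov5_trt/
#     ├── 1/
#     │   └── model.plan      # serialized TensorRT engine
#     └── config.pbtxt
#
# Example config.pbtxt for a TensorRT plan model:

name: "yolov5_trt"
platform: "tensorrt_plan"
max_batch_size: 8
input [
  {
    name: "images"            # must match the engine's input binding name
    data_type: TYPE_FP32
    dims: [ 3, 640, 640 ]
  }
]
output [
  {
    name: "output"            # must match the engine's output binding name
    data_type: TYPE_FP32
    dims: [ 25200, 85 ]
  }
]
```

With the repository in place, you would start the server with something like `tritonserver --model-repository=/path/to/model_repository`; the engine must be built on the same GPU architecture and TensorRT version as the serving machine, or Triton will fail to load it.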
That page cannot be opened.
@orderer0001 I have made the repo public. You can refer to it now.
Hello, could you also give a TensorRT deployment example for the trace?