triton-inference-server / onnxruntime_backend

The Triton backend for the ONNX Runtime.
BSD 3-Clause "New" or "Revised" License

Parameterize git repository #244

Closed nv-kmcgill53 closed 7 months ago

nv-kmcgill53 commented 7 months ago

This PR parameterizes the GitHub repository organization used when building, so dependent repositories can be pulled from a fork instead of the default upstream.
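As a sketch of the pattern (the variable name `TRITON_REPO_ORGANIZATION` and default value are assumptions based on the sibling PRs, not this PR's exact diff), the build exposes a cache variable and substitutes it into each dependency's fetch URL:

```cmake
# Assumed cache variable: lets a build pull Triton repos from a fork.
set(TRITON_REPO_ORGANIZATION "https://github.com/triton-inference-server"
    CACHE STRING "Git repository organization to pull dependencies from")

# Dependency URLs are built from the organization instead of being hard-coded.
FetchContent_Declare(
  repo-common
  GIT_REPOSITORY ${TRITON_REPO_ORGANIZATION}/common.git
  GIT_TAG ${TRITON_COMMON_REPO_TAG}
)
```

A downstream user could then point the build at a fork with, for example, `cmake -DTRITON_REPO_ORGANIZATION=https://github.com/my-org ..`.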

Related internal PRs:

- Server: https://github.com/triton-inference-server/server/pull/6934
- Core: https://github.com/triton-inference-server/core/pull/332
- Backend: https://github.com/triton-inference-server/backend/pull/96
- Checksum repository agent: https://github.com/triton-inference-server/checksum_repository_agent/pull/10
- DALI backend: https://github.com/triton-inference-server/dali_backend/pull/228
- Identity backend: https://github.com/triton-inference-server/identity_backend/pull/29
- Local cache: https://github.com/triton-inference-server/local_cache/pull/12
- OpenVINO backend: https://github.com/triton-inference-server/openvino_backend/pull/68
- PyTorch backend: https://github.com/triton-inference-server/pytorch_backend/pull/124
- Redis cache: https://github.com/triton-inference-server/redis_cache/pull/14
- Repeat backend: https://github.com/triton-inference-server/repeat_backend/pull/11
- Square backend: https://github.com/triton-inference-server/square_backend/pull/18
- TensorFlow backend: https://github.com/triton-inference-server/tensorflow_backend/pull/101
- TensorRT backend: https://github.com/triton-inference-server/tensorrt_backend/pull/81
- Python backend: https://github.com/triton-inference-server/python_backend/pull/341
- Client: https://github.com/triton-inference-server/client/pull/485

Related third-party PRs:

- https://github.com/triton-inference-server/server/pull/6668