Status: Closed (nv-kmcgill53 closed this PR 8 months ago)
This PR parameterizes the GitHub repository used when building.
Related internal PRs:
- Server: https://github.com/triton-inference-server/server/pull/6934
- Core: https://github.com/triton-inference-server/core/pull/332
- Backend: https://github.com/triton-inference-server/backend/pull/96
- Checksum repository agent: https://github.com/triton-inference-server/checksum_repository_agent/pull/10
- DALI backend: https://github.com/triton-inference-server/dali_backend/pull/228
- Identity backend: https://github.com/triton-inference-server/identity_backend/pull/29
- ONNX Runtime backend: https://github.com/triton-inference-server/onnxruntime_backend/pull/244
- OpenVINO backend: https://github.com/triton-inference-server/openvino_backend/pull/68
- PyTorch backend: https://github.com/triton-inference-server/pytorch_backend/pull/124
- Redis cache: https://github.com/triton-inference-server/redis_cache/pull/14
- Repeat backend: https://github.com/triton-inference-server/repeat_backend/pull/11
- Square backend: https://github.com/triton-inference-server/square_backend/pull/18
- TensorFlow backend: https://github.com/triton-inference-server/tensorflow_backend/pull/101
- TensorRT backend: https://github.com/triton-inference-server/tensorrt_backend/pull/81
- Python backend: https://github.com/triton-inference-server/python_backend/pull/341
- Client: https://github.com/triton-inference-server/client/pull/485
Related third-party PRs: https://github.com/triton-inference-server/server/pull/6668
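As a sketch of what this kind of parameterization enables (the flag name, default, and repository names here are illustrative assumptions, not taken from the linked PRs), a build script can compose each dependency's clone URL from a single organization argument instead of hard-coding `https://github.com/triton-inference-server` everywhere, which lets forks and internal mirrors build without patching every URL:

```python
import argparse


def repo_url(organization: str, repo: str) -> str:
    # Compose the clone URL from the parameterized GitHub organization.
    return f"https://github.com/{organization}/{repo}.git"


def parse_args(argv=None):
    # Hypothetical flag: the actual option name used by the build is
    # defined in the PRs above, not here.
    parser = argparse.ArgumentParser(description="Build-script sketch")
    parser.add_argument(
        "--github-organization",
        default="triton-inference-server",
        help="GitHub organization to fetch dependency repositories from",
    )
    return parser.parse_args(argv)


if __name__ == "__main__":
    args = parse_args()
    # Example dependency repositories, chosen for illustration only.
    for repo in ("core", "backend", "client"):
        print(repo_url(args.github_organization, repo))
```

Running the sketch with `--github-organization myfork` would print URLs under `https://github.com/myfork/...`, while the default reproduces the upstream organization.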