triton-inference-server / onnxruntime_backend

The Triton backend for ONNX Runtime.
BSD 3-Clause "New" or "Revised" License

Support initializer as inference request input. Add test #180

Closed by GuanLuo 1 year ago

GuanLuo commented 1 year ago

Mirrored from https://github.com/triton-inference-server/onnxruntime_backend/pull/177, contributed by @taoisu