triton-inference-server / onnxruntime_backend

The Triton backend for the ONNX Runtime.
BSD 3-Clause "New" or "Revised" License

Support arbitrary options for execution providers #217

Open gedoensmax opened 10 months ago

gedoensmax commented 10 months ago

Is your feature request related to a problem? Please describe.

Supporting arbitrary execution-provider options would resolve many of the recurring requests to extend the available option catalogue. It should also reduce the maintenance overhead of adding new options in the future.

Describe the solution you'd like

I believe the parameter parsing here: https://github.com/triton-inference-server/onnxruntime_backend/blob/main/src/onnxruntime.cc#L456-L523 could be replaced with something more flexible, like what onnxruntime_perf_test does: https://github.com/microsoft/onnxruntime/blob/ed89ca573a83e5adcf86a8e2cae912b6eeb9a335/onnxruntime/test/perftest/ort_test_session.cc#L132-L160

Issues that should be solved by this: #168, #166, and #194.