triton-inference-server / onnxruntime_backend

The Triton backend for the ONNX Runtime.
BSD 3-Clause "New" or "Revised" License

How do I get all possible input and output names? #267

Open zhmiao opened 3 months ago

zhmiao commented 3 months ago

Hello. I am pretty new to using onnxruntime with the Triton Inference Server. I wonder how I can get all possible input and output names for my ONNX models? Currently I can only copy from existing config.pbtxt files, and I don't know how people determine the input and output names for their own models. Thank you so much!
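
To make the question concrete, is something like the following the right idea? This is just a minimal sketch of what I am imagining (the `model.onnx` path is a placeholder for an actual model file), using the onnxruntime Python API to list a session's input and output names, which I assume are the same names that go into config.pbtxt.

```python
import onnxruntime as ort

# "model.onnx" is a placeholder; point this at your actual model file.
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

# Print the graph's input and output names (and their shapes/types),
# which I believe are the names config.pbtxt should refer to.
print("Inputs:")
for inp in session.get_inputs():
    print(f"  name={inp.name}  shape={inp.shape}  type={inp.type}")

print("Outputs:")
for out in session.get_outputs():
    print(f"  name={out.name}  shape={out.shape}  type={out.type}")
```

Is this the intended way to discover the names, or does Triton itself provide a way to list them?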