Hello. I am fairly new to using ONNX Runtime with Triton Inference Server. How can I find the input and output names for my ONNX models? Currently I can only copy from existing config.pbtxt files, and I don't know how people determine the input and output names for their own models. Thank you so much!