zeynepkoyun opened this issue 2 years ago
Were you able to run MXNet models with Triton Inference Server?
All insightface models run on Triton after conversion to ONNX or TensorRT, though I'm not sure if this repo currently works with Triton, since I haven't updated its support for a while.
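For reference, a minimal sketch of the MXNet-to-ONNX conversion step mentioned above (not taken from this repo). The checkpoint file names, the `(1, 3, 112, 112)` input shape, and the model repository layout are assumptions based on typical insightface recognition models; the exporter import also differs between MXNet versions (`mxnet.contrib.onnx` in 1.x, `mx.onnx` in 1.9+), so adjust to your install.

```python
# Sketch: export an MXNet symbol/params checkpoint to ONNX so Triton's
# onnxruntime backend can serve it. File names and shapes are assumed.
import numpy as np
from mxnet.contrib import onnx as onnx_mxnet  # MXNet 1.x contrib exporter

sym = "model-symbol.json"          # network definition (assumed name)
params = "model-0000.params"       # trained weights (assumed name)
input_shape = [(1, 3, 112, 112)]   # NCHW, 112x112 face crop (assumed)

onnx_mxnet.export_model(
    sym,
    params,
    input_shape,
    np.float32,
    "model.onnx",                  # output file to place in the Triton repo
    verbose=True,
)

# Expected Triton model repository layout (onnxruntime backend), assumed:
#   models/
#     arcface_onnx/
#       config.pbtxt               # platform: "onnxruntime_onnx"
#       1/
#         model.onnx
```

If you want a TensorRT plan instead of serving the ONNX model directly, the resulting `model.onnx` can usually be converted with `trtexec --onnx=model.onnx --saveEngine=model.plan` on the target GPU and served via Triton's TensorRT backend.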