wangjin2945 closed this issue 6 months ago
Hi, it may be caused by an unstable network, or by the number of concurrent user requests exceeding our API's rate limit. You can try again later. Thanks.
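For transient failures like these (unstable network or a temporarily exceeded rate limit), a common client-side workaround is to retry with exponential backoff instead of failing on the first error. A minimal sketch of that pattern; `call_with_backoff` and `flaky_api` are illustrative names, not part of the provided scripts:

```python
import time

def call_with_backoff(request_fn, max_attempts=5, base_delay=0.01):
    """Retry a zero-argument callable, doubling the delay after each failure."""
    for attempt in range(max_attempts):
        try:
            return request_fn()
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise  # out of attempts, surface the error
            time.sleep(base_delay * (2 ** attempt))

# Stand-in for the real API endpoint: fails twice, then succeeds.
state = {"calls": 0}
def flaky_api():
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("network unstable or rate limited")
    return "ok"

print(call_with_backoff(flaky_api))  # succeeds on the third attempt
```

In a real script you would wrap the HTTP request made by `example.sh`'s client and also retry on HTTP 429/5xx responses, with a larger base delay.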
But did you use the raw files we provided? It seems we don't have a file named "model.py(145)".
Thank you WhirlFirst! I did not use any other files. I simply ran example.sh in ./apiexample. I just tried again, but it still doesn't work.
Hi, can you try it now? The engineer has restarted the service and it should work now. Sorry for the trouble using the API.
Thank you WhirlFirst! It works now.
I got the same error when I ran example.sh in ./apiexample
When I was running example.sh, an error occurred and the returned message was:

```
code:400 text:{"error":"Failed to process the request(s) for model instance 'xtrimogene_0', message: TypeError: init(): incompatible constructor arguments. The following argument types are supported:\n 1. c_python_backend_utils.InferenceResponse(output_tensors: List[c_python_backend_utils.Tensor], error: c_python_backend_utils.TritonError = None)\n\nInvoked with: kwargs: error=<c_python_backend_utils.TritonError object at 0x7fc3b3527a30>\n\nAt:\n /work/xtrimogene/service/xtrimogene/1/model.py(145): execute\n"} reason:Bad Request
```
What happened? Is this a client-side or server-side problem?
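Judging from the traceback, the exception is raised inside the server's own `model.py` (`/work/xtrimogene/service/xtrimogene/1/model.py`), so it is server-side: the code apparently constructed `InferenceResponse` with only an `error=` keyword, while the bound constructor requires `output_tensors` as well. A self-contained reproduction of that class of TypeError; `InferenceResponse` and `TritonError` below are stand-ins mirroring the signature shown in the error message, not the real `c_python_backend_utils` classes:

```python
from typing import List, Optional

class TritonError:
    """Stand-in for c_python_backend_utils.TritonError."""
    def __init__(self, message: str):
        self.message = message

class InferenceResponse:
    """Stand-in mirroring the supported signature from the error message:
    InferenceResponse(output_tensors: List[Tensor], error: TritonError = None)."""
    def __init__(self, output_tensors: List, error: Optional[TritonError] = None):
        self.output_tensors = output_tensors
        self.error = error

# What model.py(145) apparently did: pass error= alone, no output_tensors.
try:
    InferenceResponse(error=TritonError("inference failed"))
except TypeError as exc:
    print("TypeError:", exc)

# The likely fix: supply an (empty) output_tensors list alongside the error.
resp = InferenceResponse(output_tensors=[], error=TritonError("inference failed"))
print(resp.error.message)
```

This is why the client sees an HTTP 400: the backend's error-reporting path itself crashes before a well-formed error response can be returned.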