Describe the bug
When following this notebook, endpoint creation fails. CloudWatch shows the following error: creating server: Invalid argument - load failed for model '/opt/ml/model/::t5_pytorch': version 1 is at UNAVAILABLE state: Internal: AssertionError:
To reproduce
Follow the notebook linked below through the T5 model deployment steps; the error occurs when creating the endpoint.
Logs
error: creating server: Invalid argument - load failed for model '/opt/ml/model/::t5_pytorch': version 1 is at UNAVAILABLE state: Internal: AssertionError:
Link to the notebook https://github.com/aws/amazon-sagemaker-examples/blob/main/inference/nlp/realtime/triton/single-model/t5_pytorch_python-backend/t5_pytorch_python-backend.ipynb
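For context, a minimal sketch of the create_model request assembled before the failing create_endpoint call, assuming the usual Triton-on-SageMaker setup from this family of notebooks. The image URI, S3 path, role ARN, and model name below are placeholders, not the notebook's exact values; the relevant detail is the SAGEMAKER_TRITON_DEFAULT_MODEL_NAME environment variable, which matches the 't5_pytorch' model named in the load error.

```python
# Sketch of the request payload preceding endpoint creation (placeholders throughout).
model_name = "t5-pytorch-triton"  # placeholder SageMaker model name

container = {
    # Placeholder Triton inference-server DLC image URI
    "Image": "<account>.dkr.ecr.<region>.amazonaws.com/sagemaker-tritonserver:<tag>",
    # Placeholder S3 location of the packaged Triton model repository
    "ModelDataUrl": "s3://<bucket>/t5_pytorch/model.tar.gz",
    "Environment": {
        # Tells Triton which model in the repository to serve; this is the
        # model name that appears in the load-failure message.
        "SAGEMAKER_TRITON_DEFAULT_MODEL_NAME": "t5_pytorch"
    },
}

create_model_request = {
    "ModelName": model_name,
    "PrimaryContainer": container,
    "ExecutionRoleArn": "arn:aws:iam::<account>:role/<sagemaker-role>",  # placeholder
}

# The actual calls would then be:
#   sm = boto3.client("sagemaker")
#   sm.create_model(**create_model_request)
#   ... create_endpoint_config / create_endpoint, where the failure surfaces.
```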