thalesfc opened this issue 5 months ago (Open)
Additionally, I was trying to add extra logging, but unfortunately I hit this other issue, described at https://repost.aws/questions/QUMB6LvtnmT9adZ7RMw90iXQ/sagemaker-inconsistently-logging-user-log-statements
The install part of the tutorial is also broken; those two versions are not compatible with the latest conda:
Do you have the requirements.txt in the code folder? https://github.com/aws/amazon-sagemaker-examples/tree/main/sagemaker-script-mode/pytorch_bert/code In my test, without this file we hit this same ModelError.
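To make the packaging point concrete: the SageMaker PyTorch serving container only installs extra dependencies when requirements.txt sits next to the inference script under code/ inside model.tar.gz. Here is a minimal, stdlib-only sketch that builds and checks that layout; the file names (model.pth, inference.py) are placeholders mirroring the tutorial, not a definitive implementation.

```python
import tarfile
from pathlib import Path

def build_model_archive(model_dir: Path, out_path: Path) -> None:
    """Package artifacts the way the serving container expects:
    model files at the archive root, inference script and
    requirements.txt under code/."""
    with tarfile.open(out_path, "w:gz") as tar:
        for item in model_dir.iterdir():
            # add each top-level entry; directories are added recursively
            tar.add(item, arcname=item.name)

# demo with placeholder files (paths/names are hypothetical)
root = Path("model_src")
(root / "code").mkdir(parents=True, exist_ok=True)
(root / "model.pth").write_bytes(b"")                  # placeholder weights
(root / "code" / "inference.py").write_text("# model_fn / predict_fn here\n")
(root / "code" / "requirements.txt").write_text("transformers\n")

build_model_archive(root, Path("model.tar.gz"))

with tarfile.open("model.tar.gz") as tar:
    names = set(tar.getnames())
```

If "code/requirements.txt" is missing from names, the container silently skips dependency installation and the first import in inference.py fails at invocation time, which surfaces as exactly this opaque 500.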
Hi @thalesfc @zhaoqizqwang, I do have the requirements.txt in my code folder. The code also generates all the required files while executing model.save(....). However, I am still receiving the same 500 error. Any tips or tricks?
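Since several of us are double-checking the same files by hand, a small pre-deploy sanity check can rule out the missing-file cause before uploading. This is a stdlib-only sketch; the required file names assume the layout used in the BERT script-mode example and may differ for other containers.

```python
from pathlib import Path

# files the pytorch_bert example keeps under code/ (assumed names)
REQUIRED = ("inference.py", "requirements.txt")

def missing_serving_files(code_dir: str) -> list:
    """Return the names of required serving files absent from code_dir."""
    d = Path(code_dir)
    return [name for name in REQUIRED if not (d / name).is_file()]

# demo against a throwaway directory
demo = Path("code_demo")
demo.mkdir(exist_ok=True)
(demo / "inference.py").write_text("# handlers\n")
gaps = missing_serving_files(demo)   # requirements.txt is still missing here
```

Running this right before model.deploy(...) makes the failure mode explicit instead of a server-side 500.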
Link to the notebook
https://github.com/aws/sagemaker-python-sdk/files/14076220/deploy-v4-any-pytorch-tutorial.md
Describe the bug
"Host a Pretrained Model on SageMaker" tutorial fails with "An error occurred (ModelError) when calling the InvokeEndpoint operation: Received server error (500) from primary and could not load the entire response body."
To reproduce
Follow https://sagemaker-examples.readthedocs.io/en/latest/sagemaker-script-mode/pytorch_bert/deploy_bert_outputs.html#Deploy-Model
Logs
Expected behavior
The InvokeEndpoint call should return the model's prediction instead of the 500 error seen above.
Similar post
I made a similar post under aws/sagemaker-python-sdk: https://github.com/aws/sagemaker-python-sdk/issues/4395