farzanehnakhaee70 opened 2 years ago
Hi!
Almost 2 years later, I am hitting the same problem. I can't deploy multiple models on SageMaker because there is no way to pass the "--shm-size" flag to the container.
Did you ever resolve this? Do I have to switch away from python_backend in order to deploy?
Describe the bug I am deploying an ensemble NLP model. When I run the specified code, I get this error:
Based on my investigation, each of the python_backend model instances needs 64MB of shared memory. On the other hand, there isn't any option to change the shm size of the container. So how can we solve this problem?
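For environments where you control the container launch (not SageMaker), a possible workaround is a sketch along these lines: raise the container's shared-memory allocation with Docker's `--shm-size` flag, and/or shrink python_backend's per-instance shared-memory region via Triton's `--backend-config` option. The image tag, model-repository path, and byte values below are assumptions for illustration, not values taken from this issue.

```shell
# Sketch only: adjust paths, image tag, and sizes for your setup.

# Option 1 (hypothetical values): give the container more shared memory.
# Docker's default /dev/shm is 64MB, which a single python_backend
# instance can exhaust on its own.
docker run --rm --shm-size=1g \
  -v /path/to/model_repository:/models \
  nvcr.io/nvidia/tritonserver:<xx.yy>-py3 \
  tritonserver --model-repository=/models

# Option 2 (hypothetical values): reduce how much shared memory each
# python_backend instance requests at startup and how much it grows by.
tritonserver --model-repository=/models \
  --backend-config=python,shm-default-byte-size=4194304 \
  --backend-config=python,shm-growth-byte-size=1048576
```

On SageMaker neither `docker run` flag is exposed, which is exactly the limitation described above, so option 2 (passed through however the serving container lets you set tritonserver arguments) may be the only lever available there.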