Toolkit for allowing inference and serving with PyTorch on SageMaker. Dockerfiles used for building SageMaker Pytorch Containers are at https://github.com/aws/deep-learning-containers.
Is this Dockerfile compatible with SageMaker Elastic Inference? #128
What did you find confusing? Please describe.
This is the Dockerfile in question: https://github.com/aws/sagemaker-pytorch-inference-toolkit/blob/master/docker/1.5.0/py3/Dockerfile.cpu

The page https://docs.aws.amazon.com/sagemaker/latest/dg/ei-endpoints.html#ei-endpoints-pytorch states: "You can download the Elastic Inference enabled binary for PyTorch from the public Amazon S3 bucket at console.aws.amazon.com/s3/buckets/amazonei-pytorch. For information about building a container that uses the Elastic Inference enabled version of PyTorch, see Building your image."

I am confused about the following:
1. If I use the Dockerfile above, do I still need to download and install the binary from https://console.aws.amazon.com/s3/buckets/amazonei-pytorch to build the Docker container image?
2. If I want to use a custom Docker image for SageMaker Elastic Inference, do I need to convert my PyTorch code into TorchScript? This part is not covered.
3. Can I use it with Python >= 3.7 and PyTorch >= 1.12?
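For context on the TorchScript question: the Elastic Inference documentation generally assumes the model has been compiled to TorchScript before deployment. A minimal sketch of that conversion, assuming a simple eager-mode module and using `torch.jit.trace` (the model class and shapes here are illustrative, not from the toolkit):

```python
# Hedged sketch: converting an eager-mode PyTorch model to TorchScript,
# the serializable form typically expected when serving with the
# Elastic Inference enabled PyTorch build. TinyModel is a hypothetical
# placeholder for a real model.
import torch


class TinyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 2)

    def forward(self, x):
        return self.linear(x)


model = TinyModel().eval()
example_input = torch.randn(1, 4)

# torch.jit.trace runs the model once on the example input, records the
# executed operations, and returns a ScriptModule.
traced = torch.jit.trace(model, example_input)

# The saved artifact (e.g. model.pt) is what a serving container loads
# with torch.jit.load; the Python class definition is no longer needed.
traced.save("model.pt")
```

Tracing only captures the control-flow path taken for the example input; models with data-dependent branching would need `torch.jit.script` instead.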