Toolkit for inference and serving with PyTorch on SageMaker. Dockerfiles used for building SageMaker PyTorch Containers are at https://github.com/aws/deep-learning-containers.
What did you find confusing? Please describe.
Could you explain the required Dockerfile format for using CUDA/GPU on a SageMaker instance? I understand that the container must use sagemaker-inference-toolkit. I tried a plain CUDA base image, but it has problems serving.
Describe how documentation can be improved
There are no examples or samples showing how to build an image with CUDA support.
Additional context
Add any other context or screenshots about the documentation request here.
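For reference, here is the kind of Dockerfile I was expecting the docs to show. This is only a sketch of my understanding, not a working recipe: the base image tag, the pip package name (sagemaker-pytorch-inference), and the entrypoint module path are assumptions on my part, pieced together from the deep-learning-containers repo. It would be great if the docs confirmed or corrected something like this:

```dockerfile
# Sketch only, assuming a CUDA-enabled PyTorch base image and that the
# sagemaker-pytorch-inference package provides the serving entrypoint.
FROM pytorch/pytorch:2.1.0-cuda11.8-cudnn8-runtime

# Install the toolkit that wraps TorchServe for SageMaker
# (package name assumed; please confirm in the docs)
RUN pip install --no-cache-dir sagemaker-pytorch-inference

# SageMaker starts the container with the "serve" argument and expects
# the model under /opt/ml/model; the entrypoint module path below is my
# guess based on the deep-learning-containers Dockerfiles.
ENTRYPOINT ["python", "-m", "sagemaker_pytorch_serving_container.serving"]
CMD ["serve"]
```

An official example like this, with the correct base image and entrypoint spelled out, would answer the GPU/CUDA question directly.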