aws / sagemaker-mxnet-inference-toolkit

Toolkit for allowing inference and serving with MXNet in SageMaker. Dockerfiles used for building SageMaker MXNet Containers are at https://github.com/aws/deep-learning-containers.
Apache License 2.0

How can I build this project without AWS CodeBuild? #101

Closed · jamiekang closed this issue 4 years ago

jamiekang commented 4 years ago

Hello, I am trying to build this project from the Linux (Amazon EC2) command line. How can I run the commands in "buildspec.yml" in a terminal? Is there any tool to convert buildspec.yml into a shell script?

Thanks.

nadiaya commented 4 years ago

"buildspec.yml" is used for triggering and running tests on PRs and contains a list of shell commands to do so.

Could you give us more details about your use case and what you are trying to achieve?

jamiekang commented 4 years ago

TL;DR: Dockerfile.gpu should be modified (and maybe Dockerfile.cpu too). It tries to copy sagemaker_mxnet_inference.tar.gz, but it should be sagemaker_mxnet_serving_container.tar.gz.

I was trying to build my image as described in the "Building Images" section of README.rst (environment: Python 3, GPU on an EC2 p3 instance, MXNet 1.6.0, no EI), but I failed when running the docker build -t preprod-mxnet-serving:1.6.0-gpu-py3 -f Dockerfile.gpu . command.

This is the error message I got:

Step 15/26 : WORKDIR /
 ---> Using cache
 ---> d504cb460349
Step 16/26 : COPY sagemaker_mxnet_inference.tar.gz /sagemaker_mxnet_inference.tar.gz
COPY failed: stat /var/lib/docker/tmp/docker-builder900894127/sagemaker_mxnet_inference.tar.gz: no such file or directory

I found that the reason is some missing commands, like the ones specified in buildspec.yml.

But your answer is that the file is only used for tests, so I did more experiments and found that Dockerfile.gpu should be modified as described in the TL;DR above.

nadiaya commented 4 years ago

The latest Dockerfiles are correct, since the package name has been changed from sagemaker_mxnet_serving_container.tar.gz to sagemaker_mxnet_inference.tar.gz.

I will update the outdated documentation.
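For reference, a minimal sketch of the manual build flow implied here (the file and image names come from this thread; the dist/ glob and the working directory are assumptions, so adjust them to your checkout):

# Build the Python source distribution so the tarball the Dockerfile copies exists.
python setup.py sdist

# Place it in the Docker build context under the name the Dockerfile expects
# (the dist/ glob is an assumption about what setup.py produces).
cp dist/sagemaker_mxnet_inference-*.tar.gz sagemaker_mxnet_inference.tar.gz

# Build the image from the directory containing Dockerfile.gpu and the tarball.
docker build -t preprod-mxnet-serving:1.6.0-gpu-py3 -f Dockerfile.gpu .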

jamiekang commented 4 years ago

Thanks!