Miserlou / Zappa

Serverless Python
https://blog.zappa.io/
MIT License
11.89k stars 1.2k forks

Packaging with cpu-only version of PyTorch #1851

Open ben0it8 opened 5 years ago

ben0it8 commented 5 years ago

I'm trying to deploy a Lambda function for a PyTorch model performing inference on CPU. In order to stay within the 512 MB size limit of the /tmp folder, I want to use the CPU-only build of PyTorch, so I'm installing it with:

pip3 install https://download.pytorch.org/whl/cpu/torch-1.0.1.post2-cp36-cp36m-linux_x86_64.whl

However, after I have my virtualenv set up and the dependencies pip-installed, when I try to deploy the package, Zappa installs the full PyTorch library -- which is over 500 MB by itself (using the locally cached manylinux wheel). Any idea how to prevent that?

I'm deploying with the latest Zappa and Python 3.6 on Ubuntu 16.04.
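For context on the 512 MB /tmp limit mentioned above: Zappa's documented workaround for packages above Lambda's direct-upload size limit is the slim_handler option, which uploads the dependencies to S3 and extracts them into /tmp at runtime. A minimal settings sketch (the bucket name, region, and app path are placeholders):

```json
{
    "production": {
        "app_function": "app.app",
        "aws_region": "us-east-1",
        "s3_bucket": "my-zappa-deployments",
        "slim_handler": true
    }
}
```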

paulbisso commented 5 years ago

Did you figure this out? I have the same issue.

paulbisso commented 5 years ago

Actually, just got it figured out. Credit to https://github.com/pedrohbtp/pytorch-zappa-serverless for solving it.

Install PyTorch from a requirements.txt file containing the following lines:

-f https://download.pytorch.org/whl/nightly/cpu/torch_nightly.html
torch-nightly==1.0.0.dev20181105

This worked for me; zappa deploys no problem.
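To sanity-check that the resulting virtualenv actually fits within Lambda's limits before deploying, a quick stdlib-only helper (hypothetical, not part of Zappa) can total up the size of site-packages:

```python
import os
import site

def dir_size_mb(path):
    """Recursively sum file sizes under path, in megabytes."""
    total = 0
    for root, _, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.isfile(fp):
                total += os.path.getsize(fp)
    return total / (1024 * 1024)

# Print the unpacked size of each site-packages directory, to compare
# against Lambda's 250 MB unzipped limit (or 512 MB in /tmp with slim_handler).
for pkg_dir in site.getsitepackages():
    if os.path.isdir(pkg_dir):
        print(f"{pkg_dir}: {dir_size_mb(pkg_dir):.1f} MB")
```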

jcomish commented 4 years ago

Hmm, I tried using the approach provided in the linked repo with no success... I get the following error: ModuleNotFoundError: No module named 'torch._C'

Did you run into this?

jcomish commented 4 years ago

Aha, I found my problem: you need to deploy from a Linux OS. For those who run into this in the future, be sure to use a more recent version of PyTorch; with the version listed in paulbisso's answer, a warning came up that would break in a Lambda environment. I used the following: torch-nightly==1.0.0.dev20190113

Alternatively, you could build the package yourself with the wheels provided by PyTorch instead of deploying from Linux.
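The torch._C error above comes up because torch._C is PyTorch's compiled C extension, so wheels are platform-specific: a wheel built on macOS or Windows ships the wrong binary and fails at import time on Lambda, which runs on linux-x86_64. A quick stdlib-only sanity check of the platform your build machine targets:

```python
import sysconfig

# The platform tag of wheels built/resolved on this machine,
# e.g. 'linux-x86_64' or 'macosx-10.15-x86_64'.
build_platform = sysconfig.get_platform()
print(build_platform)

if not build_platform.startswith("linux"):
    # Lambda's runtime is Linux, so non-Linux native wheels won't import there.
    print("Warning: wheels built here will not match the Lambda runtime")
```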