aws-samples / serverless-ai-workshop

This workshop demonstrates two methods of machine learning inference for global production using AWS Lambda and Amazon SageMaker
MIT No Attribution

Cannot create lambda due to code size #5

Open cipri-tom opened 5 years ago

cipri-tom commented 5 years ago

Hi,

Thank you for the very detailed instructions on how to achieve serverless AI!

I am having trouble creating a Lambda function, as the unpacked deployment size is more than the 250 MB limit. I have only run `pip install --target=. sagemaker`, but that has installed a lot of dependencies:

Installing collected packages: jmespath, six, python-dateutil, urllib3, docutils, botocore, s3transfer, boto3, numpy, setuptools, protobuf, scipy, protobuf3-to-dict, texttable, jsonschema, dockerpty, docopt, idna, chardet, certifi, requests, docker-pycreds, websocket-client, pyasn1, asn1crypto, pycparser, cffi, cryptography, pynacl, bcrypt, paramiko, docker, PyYAML, cached-property, docker-compose, sagemaker

I also ran the `find . -name "*.so" | xargs strip` command, but that only reduced the size by about 7 MB; the total is now 271 MB.

Note that I am doing this locally on my computer, rather than in the SageMaker-hosted terminal. Could that be the reason? Maybe boto3 and all its dependencies already exist in that environment, so pip would not have to install them?

tbass134 commented 4 years ago

I did this locally as well and got it down to 57 MB; however, the limit for the zipped package is 50 MB.

movaldivia commented 3 years ago

I have the same issue. Is there any other way to deploy a batch transformation? I need to predict image categories once a day. @cipri-tom @tbass134 @skrinak @rumiio @hyandell

cipri-tom commented 3 years ago

@movaldivia another option is to attach a file system to your Lambda: https://aws.amazon.com/blogs/aws/new-a-shared-file-system-for-your-lambda-functions/

This allows you to load the dependencies, and even the model weights, from the file system instead of the deployment package.
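In the function you then put the EFS directory on sys.path before importing anything heavy. A rough sketch, assuming the file system is mounted at /mnt/ml and the packages were pip-installed into a lib/ subdirectory there (both paths are made up):

```python
import sys

# Hypothetical EFS mount path configured on the Lambda function.
EFS_MOUNT_PATH = "/mnt/ml"


def add_efs_packages(mount_path=EFS_MOUNT_PATH):
    """Make packages installed on the attached file system importable.

    Dependencies (and model weights) too large for the 250 MB unpacked
    package limit live on EFS instead of in the deployment package.
    Returns the directory that was added to sys.path.
    """
    lib_dir = mount_path + "/lib"
    if lib_dir not in sys.path:
        sys.path.insert(0, lib_dir)
    return lib_dir


# Call this at module load time, before the heavy imports, e.g.:
# add_efs_packages()
# import sagemaker  # now resolved from EFS, not the deployment package
```

Cold starts get a bit slower because the imports go over NFS, but you are no longer anywhere near the package size limits.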

movaldivia commented 3 years ago

@cipri-tom thank you!