Open jhw opened 6 years ago
For cases like this, I think I read about a different approach somewhere: you could extract the source of only the functions you need from scipy and copy them into your Chalice project.
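If the only thing you need from scipy is a simple routine, a pure-Python stand-in can avoid shipping the package entirely. A minimal sketch (this is an illustrative reimplementation, not scipy's actual source; the tolerance defaults are assumptions) replacing `scipy.optimize.bisect`:

```python
def bisect(f, a, b, tol=1e-12, maxiter=200):
    """Pure-Python bisection root finder, a stand-in for scipy.optimize.bisect."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(maxiter):
        mid = (a + b) / 2.0
        fm = f(mid)
        if fm == 0 or (b - a) / 2.0 < tol:
            return mid
        if fa * fm < 0:
            # Root lies in the left half-interval
            b = mid
        else:
            # Root lies in the right half-interval
            a, fa = mid, fm
    return (a + b) / 2.0

# Root of x^2 - 2 on [0, 2] is sqrt(2)
root = bisect(lambda x: x * x - 2.0, 0.0, 2.0)
```

This obviously only works when the routine you need is simple enough to reimplement; anything backed by scipy's compiled C/Fortran code is much harder to carve out.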
Thanks for the link, didn't know about that.
To address your issue, you can make use of the `chalice package` workflow, which generates a SAM template that is deployed through CloudFormation:
```shell
# out/ contains the archive and the SAM template
chalice package out/
# Outputs a packaged CloudFormation template
aws cloudformation package --template-file out/sam.json --s3-bucket mybucket
```
The `aws cloudformation package` command will upload the archive to S3 and output a SAM template containing:

```yaml
Resources:
  APIHandler:
    Properties:
      CodeUri: s3://mybucket/92a8af20a990fc350d0c5c7023c89cc6
```
Prior to deploying this template, you can modify the object in S3 or edit the template itself.
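For example, something along these lines (the bucket name, object key, and stack name are illustrative; `--output-template-file` is one way to capture the packaged template instead of reading it from stdout):

```shell
# Download the uploaded archive, inspect or patch it, and push it back
aws s3 cp s3://mybucket/92a8af20a990fc350d0c5c7023c89cc6 app.zip
aws s3 cp app.zip s3://mybucket/92a8af20a990fc350d0c5c7023c89cc6

# Capture the packaged template to a file, then deploy it via CloudFormation
aws cloudformation package --template-file out/sam.json --s3-bucket mybucket \
    --output-template-file packaged.json
aws cloudformation deploy --template-file packaged.json \
    --stack-name my-chalice-app --capabilities CAPABILITY_IAM
```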
So is this a general feature request for making packages larger than 50 MB work with Lambda? That would be pretty cool; I'll mark this as a feature request. If that wasn't what you were talking about, let me know.

Using S3 rather than uploading straight to Lambda did not help when we tried it.
My team copies the Lambda zip file to S3, and we've been fine with 67 MB zips.
@kadrach I tried `aws cloudformation package`, got the package into my bucket, and deployed it to Lambda using https://www.edureka.co/community/36504/how-to-upload-a-file-from-s3-in-lambda, but I keep getting internal server errors when I call the REST API from curl. The hello world template works perfectly and `chalice local` works perfectly, but deploying through S3 does not seem to work for me.
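One thing worth checking with the approach in that link: dependencies fetched from S3 at runtime have to be extracted under a writable directory (`/tmp` in Lambda) and placed on `sys.path` before anything imports them. A hedged sketch of the pattern follows; the boto3 download is replaced by a locally built zip so the example runs standalone, and the module name `heavy_dep` is purely illustrative:

```python
import os
import sys
import tempfile
import textwrap
import zipfile

# In a real Lambda handler you would fetch the archive with boto3, e.g.
# boto3.client("s3").download_file("mybucket", "deps.zip", "/tmp/deps.zip");
# here we build a tiny archive locally so the sketch is runnable.
workdir = tempfile.mkdtemp()
zip_path = os.path.join(workdir, "deps.zip")
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("heavy_dep/__init__.py", textwrap.dedent("""
        def answer():
            return 42
    """))

# Extract once per container (cold start) and make it importable
extract_dir = os.path.join(workdir, "deps")
with zipfile.ZipFile(zip_path) as zf:
    zf.extractall(extract_dir)
sys.path.insert(0, extract_dir)

import heavy_dep
print(heavy_dep.answer())  # prints 42
```

If the extraction or `sys.path` step happens after the import (or the handler imports the dependency at module top level), you get an `ImportError` that surfaces to API Gateway as an internal server error, which matches the symptom described above.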
I have a Chalice app which is breaching the Lambda size limit, probably due to the inclusion of `scipy`. This article suggests it's possible to bypass the size restriction by deploying via S3: https://hackernoon.com/exploring-the-aws-lambda-deployment-limits-9a8384b0bec3

Is there any `chalice deploy` support (or equivalent) for deploying in this fashion?