aws / chalice

Python Serverless Microframework for AWS

Chalice deployment fails due to large package size #2115

Open shekharpalit opened 1 month ago

shekharpalit commented 1 month ago

Description

My Chalice application fails to deploy because the deployment package exceeds AWS Lambda's 50MB limit. The application includes Langchain packages for data processing, which account for most of the package size.
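For reference, the package size can be reproduced locally before deploying. This is a minimal sketch; the output directory name is arbitrary, and the artifact names may differ when automatic_layer is enabled:

chalice package /tmp/chalice-pkg                    # build deployment.zip (plus sam.json) locally
ls -lh /tmp/chalice-pkg/deployment.zip              # zipped package size
unzip -l /tmp/chalice-pkg/deployment.zip | sort -rn | head -20   # largest files first

The listing makes it easy to see which dependencies dominate the 50MB.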

Current Configuration

My config.json for the Chalice application:

{
  "app_name": "my-app-name",
  "automatic_layer": true,
  "stages": {
    "prod": {
      "api_gateway_stage": "api",
      "environment_variables": {},
      "iam_role_arn": "r",
      "manage_iam_role": false,
      "lambda_memory_size": 3008,
      "lambda_timeout": 900
    },
    "staging": {
      "api_gateway_stage": "api",
      "environment_variables": {},
      "iam_role_arn": "r",
      "log_level": "DEBUG",
      "manage_iam_role": false,
      "lambda_memory_size": 3008,
      "lambda_timeout": 900
    }
  },
  "version": "2.0"
}

Error Message

chalice.deploy.deployer.ChaliceDeploymentError: ERROR - While sending your chalice handler code to Lambda to publish_layer_version function "my-function-name-layer", received the following error:
 An error occurred (RequestEntityTooLargeException) when calling the PublishLayerVersion operation: Request must be smaller than 70167211 bytes for the PublishLayerVersion operation
This is likely because the deployment package is 51.0 MB. Lambda only allows deployment packages that are 50.0 MB or less in size. To avoid this error, decrease the size of your chalice application by removing code or removing dependencies from your chalice application.

Question

How can I create multiple layers for my Lambda function to support a deployment larger than 50MB? As noted above, the Langchain packages account for most of the size.
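In case it helps frame an answer: one direction is to publish the heavy dependencies as one or more layers out of band and reference them from config.json via Chalice's layers option, with automatic_layer disabled so Chalice doesn't try to build its own oversized layer. A minimal sketch; the layer name langchain-deps and the account ID are placeholders:

{
  "app_name": "my-app-name",
  "automatic_layer": false,
  "layers": [
    "arn:aws:lambda:us-east-1:123456789012:layer:langchain-deps:1"
  ],
  "version": "2.0"
}

Each layer is uploaded in its own PublishLayerVersion call, so no single request has to carry the full 51.0 MB, though Lambda still caps the combined unzipped size of the function and all its layers at 250 MB.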

Additional Information

What I've Tried