aws / aws-cdk

The AWS Cloud Development Kit is a framework for defining cloud infrastructure in code
https://aws.amazon.com/cdk
Apache License 2.0

(pipelines): >256 MB cdk.out breaks the pipeline #21325

Open misterjoshua opened 2 years ago

misterjoshua commented 2 years ago

Describe the bug

CDK Pipelines fail with Artifact [Synth_Output] exceeds max artifact size when the total size of the cdk.out increases beyond 256 MB.

Per the CodePipeline documentation, CodePipeline supports S3 artifacts up to 5 GB; however, the maximum size of an input artifact for an AWS CloudFormation action is 256 MB. Because CDK Pipelines sends the entire cdk.out to the CloudFormation actions, the pipeline can't take advantage of the full CodePipeline service limit.

Expected Behavior

CDK Pipelines should send CloudFormation steps only the CloudFormation templates needed to create changesets and deploy them.

Current Behavior

The entire cdk.out is sent to the CloudFormation action. While cdk.out contains the CloudFormation templates, it also includes all the assets. When an asset is inherently large (such as a Docker build context), the pipeline can easily exceed the CloudFormation action's limit.

Pipeline Execution Timeline (screenshot)

Failed Action Details (screenshot)

The Artifact (screenshot)

Reproduction Steps

I've created an SSCCE in a repo here: https://github.com/misterjoshua/cdk-pipelines-bug/blob/main/src/main.ts

To reproduce, deploy the code in the above repository, which looks like this:

import { App, DockerImage, SecretValue, Stack, Stage } from 'aws-cdk-lib';
import * as aws_s3_assets from 'aws-cdk-lib/aws-s3-assets';
import * as pipelines from 'aws-cdk-lib/pipelines';

const app = new App();

const pipelineStack = new Stack(app, 'cdk-pipelines-bug-dev', {
  env: {
    account: process.env.CDK_DEFAULT_ACCOUNT,
    region: process.env.CDK_DEFAULT_REGION,
  },
});

const pipeline = new pipelines.CodePipeline(pipelineStack, 'CodePipeline', {
  synth: new pipelines.ShellStep('Synth', {
    input: pipelines.CodePipelineSource.gitHub('misterjoshua/cdk-pipelines-bug', 'main', {
      authentication: SecretValue.secretsManager('github-token'),
    }),
    commands: [
      'yarn install',
      'yarn build',
    ],
  }),

  dockerEnabledForSynth: true,
});

const stage = new Stage(pipelineStack, 'Stage');
const stageStack = new Stack(stage, 'Stack');

new aws_s3_assets.Asset(stageStack, 'Asset', {
  path: __dirname,
  bundling: {
    // Generate a large, difficult-to-compress file (512 MB).
    image: DockerImage.fromRegistry('bash'),
    command: ['dd', 'if=/dev/urandom', 'of=/asset-output/big.dat', 'bs=1048576', 'count=512'],
  },
});

pipeline.addStage(stage);

app.synth();

Possible Solution

CDK Pipelines should create an artifact containing only CloudFormation templates and use that for the CloudFormation steps rather than the entire cdk.out. This way, assets in cdk.out can grow up to the ~5GB S3 Artifact limit without impacting the ability to deploy the CloudFormation templates.

Additional Information/Context

No response

CDK CLI Version

2.33.0 (build 859272d)

Framework Version

2.33.0

Node.js Version

v14.19.1

OS

Linux

Language

TypeScript

Language Version

TypeScript (4.7.4)

Other information

No response

uncledru commented 1 year ago

+1. A potential solution: expose the action's outputArtifact prop in the Assets stage. We could then use partialBuildSpec to trim down the artifact before passing it to subsequent stages:

partialBuildSpec: BuildSpec.fromObject({
  phases: {
    post_build: {
      commands: ['rm -rfv ./asset.*'],
    },
  },
  artifacts: {
    files: ['**/*'],
    name: 'output_trimmed',
  },
}),

Richardmbs12 commented 1 year ago

We are experiencing the same issue. We use Java-based Lambda functions, and each jar file is easily 60 MB.

We reach the 256 MB CloudFormation limit very quickly.

mrtimp commented 1 year ago

I've run into this error on a project that is deploying via CodePipeline. Any suggestions on how to approach fixing it?

mrtimp commented 1 year ago

After a discussion with someone from the CodePipeline team at Enterprise Support, the fix was rather trivial. In my case (it may not be the same for others), it was to simply remove the cdk.out/asset.* directories at the end of the synth commands, after cdk synth runs. This dropped the artifact ZIP size from over 256 MB to under 1 MB.
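Applied to the repro code above, this fix amounts to appending a cleanup command to the ShellStep's commands array. A sketch, assuming the asset directories follow CDK's asset.&lt;hash&gt; naming and that no downstream stage needs the bundled assets:

```typescript
// Synth commands with the trimming step appended. The `rm` runs after
// `yarn build` (which invokes `cdk synth` in the repro repo), so only
// the templates and manifests remain in the Synth_Output artifact.
const synthCommands: string[] = [
  'yarn install',
  'yarn build',
  // Drop the bundled asset directories (asset.<hash>) from cdk.out:
  'rm -rfv cdk.out/asset.*',
];

console.log(synthCommands.join(' && '));
```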

Kintar commented 1 month ago

+1 for this report. The workaround mentioned by @mrtimp does not work when the Lambda functions are based on Docker images, since the assets from Synth_Output are required by the "Assets" stage to actually build the Docker images.