aws / aws-cdk

The AWS Cloud Development Kit is a framework for defining cloud infrastructure in code
https://aws.amazon.com/cdk
Apache License 2.0

codepipeline: Deploying cross-account and regions while reusing existing S3 bucket and KMS key. #26557

Closed vipetrul closed 11 months ago

vipetrul commented 1 year ago

Describe the bug

Pipeline is defined in region us-east-2. The pipeline uses an existing S3 artifacts bucket with a KMS key. The pipeline includes two stages that deploy stacks into two regions, us-east-1 and us-east-2, in the target account. Note: when deploying to just one region (us-east-2) in the target account, everything works fine.

Expected Behavior

The pipeline is successfully created, including two stages that deploy to the "us-east-1" and "us-east-2" regions in the target account.

Current Behavior

An exception is raised during cdk synth:

Error: Artifact Bucket must have a KMS Key to add cross-account action 'Prepare' (pipeline account: '###PipelineAccount###', action account: '###TargetAccount###'). Create Pipeline with 'crossAccountKeys: true' (or pass an existing Bucket with a key)
    at Pipeline.getRoleFromActionPropsOrGenerateIfCrossAccount (C:\Users\vipetrul\source\repos\OSU\sp2-aws-account-tooling\cdk\node_modules\aws-cdk-lib\aws-codepipeline\lib\pipeline.js:1:12008)
    at Pipeline.getRoleForAction (C:\Users\vipetrul\source\repos\OSU\sp2-aws-account-tooling\cdk\node_modules\aws-cdk-lib\aws-codepipeline\lib\pipeline.js:1:11484)
    at Pipeline._attachActionToPipeline (C:\Users\vipetrul\source\repos\OSU\sp2-aws-account-tooling\cdk\node_modules\aws-cdk-lib\aws-codepipeline\lib\pipeline.js:1:7775)
    at Stage.attachActionToPipeline (C:\Users\vipetrul\source\repos\OSU\sp2-aws-account-tooling\cdk\node_modules\aws-cdk-lib\aws-codepipeline\lib\private\stage.js:1:3087)
    at Stage.addAction (C:\Users\vipetrul\source\repos\OSU\sp2-aws-account-tooling\cdk\node_modules\aws-cdk-lib\aws-codepipeline\lib\private\stage.js:1:1716)
    at Object.produceAction (C:\Users\vipetrul\source\repos\OSU\sp2-aws-account-tooling\cdk\node_modules\aws-cdk-lib\pipelines\lib\codepipeline\codepipeline.js:1:9194)
    at CodePipeline.pipelineStagesAndActionsFromGraph (C:\Users\vipetrul\source\repos\OSU\sp2-aws-account-tooling\cdk\node_modules\aws-cdk-lib\pipelines\lib\codepipeline\codepipeline.js:1:5932)
    at CodePipeline.doBuildPipeline (C:\Users\vipetrul\source\repos\OSU\sp2-aws-account-tooling\cdk\node_modules\aws-cdk-lib\pipelines\lib\codepipeline\codepipeline.js:1:4433)        
    at CodePipeline.buildPipeline (C:\Users\vipetrul\source\repos\OSU\sp2-aws-account-tooling\cdk\node_modules\aws-cdk-lib\pipelines\lib\main\pipeline-base.js:1:2258)
    at new PipelineStack (C:\Users\vipetrul\source\repos\OSU\sp2-aws-account-tooling\cdk\lib\pipeline-stack.ts:98:14)

Reproduction Steps

new PipelineStack(
  app,
  "SamplePipeline",
  {
    env: {
      account: "###CicdAccount###",
      region: "us-east-2",
    },
  },
);

const pipeline = new CodePipeline(this, "Pipeline", {
  pipelineName: "SamplePipeline",
  crossAccountKeys: false,
  artifactBucket: s3.Bucket.fromBucketAttributes(this, "ArtifactBucket", {
    bucketName: "artifacts-bucket-for-###TargetAccount###",
    encryptionKey: kms.Key.fromKeyArn(
      this,
      "ArtifactBucketKey",
      "###artifactsBucketKeyArn###",
    ),
  }),
  dockerEnabledForSynth: false,
  codeBuildDefaults: {
    buildEnvironment: {
      buildImage: cdk.aws_codebuild.LinuxBuildImage.STANDARD_6_0,
    },
  },
  synth: new CodeBuildStep("SynthStep", {
    input: CodePipelineSource.codeCommit(repo, props.branchName),
    buildEnvironment: { computeType: ComputeType.MEDIUM },
    primaryOutputDirectory: "cdk/cdk.out",
    commands: [
      // restore packages for CDK
      "cd cdk",
      "yarn install --frozen-lockfile",
      "npx cdk synth --context VERSION=$CODEBUILD_BUILD_NUMBER",
    ],
  }),
});

["us-east-1", "us-east-2"].forEach((region) => {
  const stage = new DeployStage(
    this,
    `Deploy.${region}`,
    props,
    {
      env: {
        account: "###TargetAccount###",
        region: region,
      },
    },
  );

  pipeline.addStage(stage);
});

Possible Solution

No response

Additional Information/Context

When target account region matches pipeline region, then no error is raised.

Also tried specifying env on each individual stack within DeployStage instead of on the DeployStage itself, but that still resulted in the same error.

CDK CLI Version

2.88.0 (build 5d497f9)

Framework Version

No response

Node.js Version

v16.16.0

OS

Windows 10

Language

Typescript

Language Version

TypeScript (5.1.6)

Other information

No response

pahud commented 1 year ago

The error comes from here:

https://github.com/aws/aws-cdk/blob/cb972325f01a73b35b2df7496c42bf448ef7246e/packages/aws-cdk-lib/aws-codepipeline/lib/pipeline.ts#L744-L755

But I wonder why you set crossAccountKeys: false in your case, since the default is true?

https://github.com/aws/aws-cdk/blob/cb972325f01a73b35b2df7496c42bf448ef7246e/packages/aws-cdk-lib/aws-codepipeline/lib/pipeline.ts#L136-L151

vipetrul commented 1 year ago

Since the bucket and corresponding KMS key are explicitly provided, I didn't want CDK to create a new KMS key on my behalf, hence crossAccountKeys: false.

Based on the source code link that you shared, it looks like there is another concept in play that deals with cross-region deployments (separate from cross-account deployments).

I need to investigate this further.

pahud commented 11 months ago

Hi

I am still working on figuring out a solution. From what I've learned from the source code: if you create the pipeline with the pipelines.CodePipeline class and its props, it does not allow you to specify an existing bucket with an existing key for the remote region; it looks like it always creates a new remote support stack and bucket for you. However, if you look at codepipeline.Pipeline and its props, you are allowed to specify crossRegionReplicationBuckets and pass the self-created pipeline to the codePipeline prop of pipelines.CodePipeline. This indicates it might be possible to use an existing S3 bucket and KMS key for the pipeline with CDK Pipelines. I am still trying to create a working sample, but I hope this could be a workaround.

pahud commented 11 months ago

OK I made a working sample for cross-account and cross-region deployment using existing remote bucket and encryption key.

Let's say we have a pipeline in us-east-1 from account A deploying to ap-northeast-1 on account B.

In Account A at us-east-1, the CDK app looks like this:

import {
  App, Stack, StackProps, Stage, StageProps, CfnOutput,
  aws_dynamodb as dynamodb,
  aws_s3 as s3,
  aws_iam as iam,
  aws_kms as kms,
  aws_codepipeline as codepipeline,
  pipelines,
  RemovalPolicy,
} from 'aws-cdk-lib';
import { Construct } from 'constructs';

/** The stacks for our app are minimally defined here.  The internals of these
  * stacks aren't important, except that DatabaseStack exposes an attribute
  * "table" for a database table it defines, and ComputeStack accepts a reference
  * to this table in its properties.
  */
class DatabaseStack extends Stack {
  public readonly table: dynamodb.TableV2;

  constructor(scope: Construct, id: string) {
    super(scope, id);
    this.table = new dynamodb.TableV2(this, 'Table', {
      partitionKey: { name: 'id', type: dynamodb.AttributeType.STRING },
      removalPolicy: RemovalPolicy.DESTROY,
    });
  }
}

interface ComputeProps {
  readonly table: dynamodb.TableV2;
}

class ComputeStack extends Stack {
  constructor(scope: Construct, id: string, props: ComputeProps) {
    super(scope, id);

    new CfnOutput(this, 'TableName', { value: props.table.tableName });
  }
}

/**
 * Stack to hold the pipeline
 */
export class MyPipelineStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    const remoteBucketName = 'YOUR_REMOTE_BUCKET_NAME';
    const remoteEncryptionKey = kms.Key.fromKeyArn(this, 'NrtBucketKey', 'YOUR_KEY_ARN');
    const codePipeline = new codepipeline.Pipeline(this, 'MyPipeline', {
      role: this.createPipelineRole(),
      crossRegionReplicationBuckets: {
        'ap-northeast-1': s3.Bucket.fromBucketAttributes(this, 'NrtBucket', {
          account: AWS_ACCOUNT_B,
          bucketName: remoteBucketName,
          region: 'ap-northeast-1',
          encryptionKey: remoteEncryptionKey,
        }),
      },
    });

    const pipeline = new pipelines.CodePipeline(this, 'Pipeline', {
      codePipeline,
      synth: new pipelines.ShellStep('Synth', {
        // Use a connection created using the AWS console to authenticate to GitHub
        // Other sources are available.
        input: pipelines.CodePipelineSource.connection('pahud/demo-pipeline', 'main', {
          connectionArn, // created using the AWS console
        }),
        commands: [
          'yarn install --frozen-lockfile',
          // 'yarn build',
          'npx cdk synth',
        ],
      }),
    });

    // 'MyApplication' is defined below. Call `addStage` as many times as
    // necessary with any account and region (may be different from the
    // pipeline's).
    pipeline.addStage(new MyApplication(this, 'Prod', {
      env: {
        account: 'AWS_ACCOUNT_B',
        region: 'ap-northeast-1',
      },
    }));

    new CfnOutput(this, 'PipelineRoleOutput', { value: codePipeline.role.roleName });
    new CfnOutput(this, 'PipelineRoleArnOutput', { value: codePipeline.role.roleArn });
  }
  private bucketAndObjectsArns(bucketName: string): string[] {
    return [
      Stack.of(this).formatArn({
        service: 's3',
        account: '',
        region: '',
        resource: bucketName,
      }),
      Stack.of(this).formatArn({
        service: 's3',
        account: '',
        region: '',
        resource: bucketName,
        resourceName: '*',
      }),
    ];
  }
  // Note: the bucket name is taken as a parameter (with the same placeholder
  // default) because `remoteBucketName` in the constructor is out of scope here.
  private createPipelineRole(remoteBucketName: string = 'YOUR_REMOTE_BUCKET_NAME'): iam.Role {
    const role = new iam.Role(this, 'PipelineRole', {
      assumedBy: new iam.ServicePrincipal('codepipeline.amazonaws.com'),
    });
    role.assumeRolePolicy?.addStatements(new iam.PolicyStatement({
      actions: ['sts:AssumeRole'],
      principals: [new iam.AccountRootPrincipal()],
    }));

    // the pipeline role is allowed to publish to the remote artifacts bucket
    role.addToPrincipalPolicy(new iam.PolicyStatement({
      actions: [
        's3:GetObject*',
        's3:GetBucket*',
        's3:List*',
        's3:DeleteObject*',
        's3:PutObject',
        's3:PutObjectLegalHold',
        's3:PutObjectRetention',
        's3:PutObjectTagging',
        's3:PutObjectVersionTagging',
        's3:Abort*',
      ],
      resources: this.bucketAndObjectsArns(remoteBucketName),
    }));
    return role;
  }
}

/**
 * Your application
 *
 * May consist of one or more Stacks (here, two)
 *
 * By declaring our DatabaseStack and our ComputeStack inside a Stage,
 * we make sure they are deployed together, or not at all.
 */
class MyApplication extends Stage {
  constructor(scope: Construct, id: string, props?: StageProps) {
    super(scope, id, props);

    const dbStack = new DatabaseStack(this, 'Database');
    new ComputeStack(this, 'Compute', {
      table: dbStack.table,
    });
  }
}

const devEnv = {
  account: process.env.CDK_DEFAULT_ACCOUNT,
  region: process.env.CDK_DEFAULT_REGION,
};

const app = new App();

new MyPipelineStack(app, 'PipelineStack', {
  env: {
    account: devEnv.account,
    region: 'us-east-1',
  },
});

app.synth();

And for Account B in ap-northeast-1:

import {
  Stack, Aws,
  App, CfnOutput,
  aws_s3 as s3,
  aws_kms as kms,
  aws_iam as iam,
  RemovalPolicy,
} from 'aws-cdk-lib';
import { Construct } from 'constructs';

function bucketAndObjectsArns(scope: Construct, bucketName: string): string[] {
  return [
    Stack.of(scope).formatArn({
      service: 's3',
      account: '',
      region: '',
      resource: bucketName,
    }),
    Stack.of(scope).formatArn({
      service: 's3',
      account: '',
      region: '',
      resource: bucketName,
      resourceName: '*',
    }),
  ];
}

const app = new App();
const stack = new Stack(app, 'NRTS3Stack', {
  env: {
    account: AWS_ACCOUNT_B,
    region: 'ap-northeast-1',
  },
});

const encryptionKey = new kms.Key(stack, 'EncryptKey', {
  alias: 'cdkpipeline-remote-bucket-key',
  removalPolicy: RemovalPolicy.DESTROY,
});

const bucket = new s3.Bucket(stack, 'NRTS3Bucket', {
  encryptionKey,
});

// grant bucket read-write access to the pipelineRole
const pipelineRoleArn = PIPELINE_ROLE_ARN_FROM_ACCOUNT_A;

bucket.grantReadWrite(iam.Role.fromRoleArn(stack, 'PipelineRole', pipelineRoleArn));
new CfnOutput(stack, 'BucketName', { value: bucket.bucketName });
new CfnOutput(stack, 'KeyArn', { value: encryptionKey.keyArn });

// allow the NRT deploy-role to access the artifacts bucket
const deployRole = iam.Role.fromRoleName(stack, 'deployRole', `cdk-hnb659fds-deploy-role-${Aws.ACCOUNT_ID}-${Aws.REGION}`);
deployRole.addToPrincipalPolicy(new iam.PolicyStatement({
  actions: [
    's3:GetObject*',
    's3:GetBucket*',
    's3:List*',
  ],
  resources: bucketAndObjectsArns(stack, bucket.bucketName),
  sid: 'PipelineStagingBucket',
}));

Now your pipeline should be able to make a cross-account and cross-region deployment with existing remote bucket and encryption key.

github-actions[bot] commented 11 months ago

This issue has not received a response in a while. If you want to keep this issue open, please leave a comment below and auto-close will be canceled.

amonigal commented 3 weeks ago

I am facing this same issue. Very hard to find success stories of this setup on the web.

The suggested solution does not work; I'm not sure how it even ran for @pahud. It's a circular-reference scenario: the Account_A stack references Account_B stack resources and vice versa, so whichever stack you deploy first will fail.

Anyway, I ran into a different issue with the suggested solution, specifically these lines:

[Stack in Account_B and Region_B]
// grant bucket read-write access to the pipelineRole
const pipelineRoleArn = PIPELINE_ROLE_ARN_FROM_ACCOUNT_A;

bucket.grantReadWrite(iam.Role.fromRoleArn(stack, 'PipelineRole', pipelineRoleArn)); ---> ISSUE/ERROR

pipelineRole exists in Account_A and Region_A... which is not findable using CDK's Role.fromRoleArn(Account_A_Role_Arn). The CDK lookup is automatically limited to the env the stack runs with, which is Account_B (Region_B). So it will fail or find nothing, and you are left with the error: Policy contains a statement with one or more invalid principals.

Anyway, I will continue vetting a solution, as this seems like a big mess using CDK.