aws / aws-cdk

The AWS Cloud Development Kit is a framework for defining cloud infrastructure in code
https://aws.amazon.com/cdk
Apache License 2.0

[sam] SAR Applications from CDK #3797

Open rhboyd opened 5 years ago

rhboyd commented 5 years ago

🚀 Feature Request

General Information

Description

The aws-delivlib library has some well-built abstractions for constructing a CI/CD pipeline. I'd like to be able to reuse those for building, testing, and publishing my Lambda Layers. The current support for semantic versioning in Layers is a bit rough, and some people have started publishing their layers as SAR Applications instead, which solves a few problems. First, a published SAR App is immediately available in every region, while a Layer Version has to be explicitly deployed to every region where you want to offer it. Second, a SAR App creates a local copy of the Layer in the account that consumes the App. This isolates the client from disruptions caused by a provider unpublishing a layer, which means customers can place greater trust in the public SAR App and Lambda Layer ecosystem.
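For context on the consuming side, deploying a published SAR App from a CDK app is already possible through the low-level AWS::Serverless::Application resource in the aws-sam module; a minimal sketch, where the application ID and version are placeholders:

import * as cdk from '@aws-cdk/core';
import * as sam from '@aws-cdk/aws-sam';

const app = new cdk.App();
const stack = new cdk.Stack(app, 'ConsumerStack');

// AWS::Serverless::Application requires the SAM transform on the template.
stack.templateOptions.transform = 'AWS::Serverless-2016-10-31';

// Deploying the App stamps a local copy of the Layer into this account.
// applicationId and semanticVersion are placeholders, not a real SAR App.
new sam.CfnApplication(stack, 'SharedLayer', {
  location: {
    applicationId: 'arn:aws:serverlessrepo:us-east-1:111122223333:applications/my-layer',
    semanticVersion: '1.0.0',
  },
});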

Proposed Solution

This feels like it's going to be REALLY hard to do, but it'll also add a lot of SAM coverage to CDK.

Environment

All of them?

Other information

https://serverless.pub/sar-layers/

eladb commented 5 years ago

I like this a lot. I've also been thinking about SAR as a great way to vend Lambda Layers that are available across all regions; I think SAR does a great job in this area.

I'm still not sure I can wrap my head around what the developer experience would be. Maybe we can work backwards from the developer experience. Can you try to articulate it?

rhboyd commented 5 years ago

I'm playing with two ideas.

First approach: the developer checks in a bash script that does all the building needed to lay out the layer's contents in a directory (e.g. pip install -r requirements.txt -t ./build), as well as any testing they want to do (e.g. run a Python app that pulls in those dependencies and exercises some business logic with them):

#!/bin/bash
set -eu

# Developer uses this bash script to build/install dependencies and set up
# the contents of the Layer in a specific directory.

yum install -y ....... [some dependencies here]
python3 ./some_file.py
# etc.....
# All artifacts end up in the ./build directory.

# Here they define what testing they want to do; could/should probably
# separate these two efforts into build then test.
cd ./tests
python3 -m venv .env
source .env/bin/activate
pip install mypackage --no-index --find-links ../build/   # build dir is ../build relative to ./tests
pytest

Assuming these tests pass, we zip up the ./build directory and stuff it into S3 from CodeBuild.
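In delivlib/CodeBuild terms, that step could look something like the following buildspec sketch; the build.sh name and the artifact layout are assumptions on my part:

import * as codebuild from '@aws-cdk/aws-codebuild';

// Run the checked-in script, zip the layer contents, and hand the zip
// (plus the SAR metadata file) to later pipeline stages as the artifact.
const buildSpec = codebuild.BuildSpec.fromObject({
  version: '0.2',
  phases: {
    build: {
      commands: [
        './build.sh',                                 // the script above
        'cd build && zip -r ../layer.zip . && cd ..', // package the layer contents
      ],
    },
  },
  artifacts: {
    files: ['layer.zip', 'metadata.yaml'],
  },
});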

One of the delivlib Pipeline stages is publishToSar(), which consumes a metadata.yaml file from the build artifacts. The Stage Construct builds a Lambda Layer resource in a CloudFormation template and adds the appropriate metadata, then the CodeBuild Project for this stage runs sam publish to actually publish the App, which also handles the local path -> S3 bucket/key substitution.
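To make that concrete, here's a hypothetical shape for the stage's options; none of these names exist in delivlib today, they're purely illustrative:

// Proposed options for a publishToSar() stage (illustrative names only).
export interface PublishToSarOptions {
  // Build artifact containing the zipped Layer contents.
  readonly layerArtifactPath: string;
  // SAR metadata from the build output: name, author, semantic version, license.
  readonly metadataFile: string;
}

// Usage sketch against an existing delivlib pipeline:
//   pipeline.publishToSar({
//     layerArtifactPath: 'layer.zip',
//     metadataFile: 'metadata.yaml',
//   });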

The second approach is for the developer to check in the zipped artifact; the Build step unzips it, runs the tests, and passes or fails accordingly. The Pipeline Stage should be the same across both approaches.

I'm still chewing on the idea, so there are almost certainly some gotchas I haven't seen yet.