awslabs / aws-deployment-framework

The AWS Deployment Framework (ADF) is an extensive and flexible framework to manage and deploy resources across multiple AWS accounts and regions based on AWS Organizations.
Apache License 2.0

Allow override of the TemplateConfiguration for the CloudFormation provider #449

Open Nr18 opened 2 years ago

Nr18 commented 2 years ago

When using ServiceCatalog as a deployment provider you have the ability to supply a configuration_file_path. When you use the generate_params.py functionality with a single template in the pipeline, you will not have any issues.

But when you have multiple templates that you would like to deploy as part of a single deployment, and the CloudFormation Parameters of those stacks are not equal, you will run into issues.

By supporting configuration_file_path for a CloudFormation deployment target, you would make it possible to supply a specific configuration for each of these "other" stacks.

A use case for this is combining a delegation role in the master account with the implementation of the service in another account. Keeping them in a single pipeline and repository makes them easy to maintain, and the execution order is guaranteed.
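For illustration, a hypothetical deployment-map fragment showing what the requested override could look like (the account IDs, paths, and file names are placeholders, and the configuration_file_path property on a CloudFormation target is exactly the feature being requested here, not something ADF supports today):

```yaml
# Hypothetical sketch: a per-target configuration_file_path for the
# CloudFormation provider. All identifiers below are made up.
pipelines:
  - name: sample-delegation
    default_providers:
      deploy:
        provider: cloudformation
    targets:
      - path: "111111111111"   # master account: delegation role stack
        properties:
          configuration_file_path: "params/delegation-role.json"
      - path: "222222222222"   # workload account: service stack
        properties:
          configuration_file_path: "params/service.json"
```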

sbkok commented 2 years ago

If I understand correctly, you are using the CloudFormation provider and you want to deploy multiple stacks from the same repository.

Did you consider using the Mono Repo structure, as explained here? https://github.com/awslabs/aws-deployment-framework/tree/0723ddf4eaf55888ae780dc48873f0ec4766cfbd/samples/sample-mono-repo

The root_dir property allows you to specify which stack you are deploying, storing each stack in a separate directory. This way you can use one pipeline per stack, or combine all stacks in a single pipeline.

If you combine all in a single pipeline, you would change directories into each stack, run the generate_params.py script, and move on to the next directory. For the deployment targets, you can set the root_dir per target.

Would this resolve your issue?

Nr18 commented 2 years ago

Hi @sbkok,

Yes, I did have a look at the mono-repo example, but that requires two pipelines, and then you cannot pass the output of the first CloudFormation stack to the next.

I tried setting the root_dir and executing generate_params.py, but this failed. I will reproduce it and post the details; if I remember correctly, it ran into import issues when executed, so I figured it was not a supported scenario. But from your reaction I assume it should work? In that case I might have found a bug.

I will come back to this!

Nr18 commented 2 years ago
```
[Container] 2022/03/11 11:55:16 Running command cd ./subscription
[Container] 2022/03/11 11:55:16 Running command python ../adf-build/generate_params.py
Traceback (most recent call last):
  File "../adf-build/generate_params.py", line 17, in <module>
    from resolver import Resolver
  File "/codebuild/output/src140950311/src/adf-build/resolver.py", line 11, in <module>
    from s3 import S3
ModuleNotFoundError: No module named 's3'

[Container] 2022/03/11 11:55:16 Command did not exit successfully python ../adf-build/generate_params.py exit status 1
[Container] 2022/03/11 11:55:16 Phase complete: BUILD State: FAILED
[Container] 2022/03/11 11:55:16 Phase context status code: COMMAND_EXECUTION_ERROR Message: Error while executing command: python ../adf-build/generate_params.py. Reason: exit status 1
```

Ah yes, that was it indeed. Because the cwd has changed, the s3 module could not be found. The following command:

```
PYTHONPATH=../adf-build/python python ../adf-build/generate_params.py
```

would solve that... 🤔 What approach do you prefer here @sbkok? The PYTHONPATH is set at project level, so we need an absolute path in order to get this working...

Nr18 commented 2 years ago

One solution would be replacing/appending ${LAMBDA_TASK_ROOT}/adf-build/python to the PYTHONPATH. This would then be part of the generate_params.py script. Would there be any concern doing this? If not, I am happy to change the PR into that, plus the needed documentation.
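As a sketch of the idea (not the actual PR): generate_params.py could prepend an absolute path to sys.path so the helper modules resolve regardless of the invoking directory. The variant below derives the path from the script's own location instead of an environment variable; an env-var-based variant would look similar.

```python
import os
import sys

# Directory containing this script (i.e. adf-build/), as an absolute path.
ADF_BUILD_DIR = os.path.dirname(os.path.abspath(__file__))

# Prepend adf-build/python, where shared helpers such as s3.py live,
# so `from s3 import S3` works no matter what the cwd is.
sys.path.insert(0, os.path.join(ADF_BUILD_DIR, "python"))
```

Because the entry is absolute, a later `cd ./subscription` in the buildspec no longer breaks the imports.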