patrickmoore-nc closed this issue 2 years ago
I see in the new code that the buildspec should look like this, with pre- and post-API-helper scripts: https://github.com/aws-ia/terraform-aws-control_tower_account_factory/blob/main/modules/aft-customizations/buildspecs/aft-global-customizations-terraform.yml
For reference, here's mine:
```yaml
# Copyright Amazon.com, Inc. or its affiliates. All rights reserved.
# SPDX-License-Identifier: Apache-2.0
#
version: 0.2

phases:
  pre_build:
    commands:
      - DEFAULT_PATH=$(pwd)
      - TIMESTAMP=$(date '+%Y-%m-%d %H:%M:%S')
      - AWS_MODULE_SOURCE=$(aws ssm get-parameter --name "/aft/config/aft-pipeline-code-source/repo-url" --query "Parameter.Value" --output text)
      - AWS_MODULE_GIT_REF=$(aws ssm get-parameter --name "/aft/config/aft-pipeline-code-source/repo-git-ref" --query "Parameter.Value" --output text)
      - TF_VERSION=$(aws ssm get-parameter --name "/aft/config/terraform/version" --query "Parameter.Value" --output text)
      - TF_DISTRIBUTION=$(aws ssm get-parameter --name "/aft/config/terraform/distribution" --query "Parameter.Value" --output text)
      - CT_MGMT_REGION=$(aws ssm get-parameter --name "/aft/config/ct-management-region" --query "Parameter.Value" --output text)
      - AFT_MGMT_ACCOUNT=$(aws ssm get-parameter --name "/aft/account/aft-management/account-id" --query "Parameter.Value" --output text)
      - AFT_EXEC_ROLE_ARN=arn:aws:iam::$AFT_MGMT_ACCOUNT:role/AWSAFTExecution
      - VENDED_EXEC_ROLE_ARN=arn:aws:iam::$VENDED_ACCOUNT_ID:role/AWSAFTExecution
      - AFT_ADMIN_ROLE_NAME=$(aws ssm get-parameter --name /aft/resources/iam/aft-administrator-role-name | jq --raw-output ".Parameter.Value")
      - AFT_ADMIN_ROLE_ARN=arn:aws:iam::$AFT_MGMT_ACCOUNT:role/$AFT_ADMIN_ROLE_NAME
      - ROLE_SESSION_NAME=$(aws ssm get-parameter --name /aft/resources/iam/aft-session-name | jq --raw-output ".Parameter.Value")
      - |
        ssh_key_parameter=$(aws ssm get-parameter --name /aft/config/aft-ssh-key --with-decryption 2> /dev/null || echo "None")
        if [[ $ssh_key_parameter != "None" ]]; then
          ssh_key=$(jq --raw-output ".Parameter.Value" <<< $ssh_key_parameter)
          mkdir -p ~/.ssh
          echo "Host *" >> ~/.ssh/config
          echo "StrictHostKeyChecking no" >> ~/.ssh/config
          echo "UserKnownHostsFile=/dev/null" >> ~/.ssh/config
          echo "$ssh_key" > ~/.ssh/ssh_key
          echo -e "\n\n" >> ~/.ssh/ssh_key
          chmod 600 ~/.ssh/ssh_key
          eval "$(ssh-agent -s)"
          ssh-add ~/.ssh/ssh_key
        fi
      - git config --global credential.helper '!aws codecommit credential-helper $@'
      - git config --global credential.UseHttpPath true
      - git clone -b $AWS_MODULE_GIT_REF $AWS_MODULE_SOURCE aws-aft-core-framework
      - python3 -m venv ./venv
      - source ./venv/bin/activate
      - pip install jinja2-cli==0.7.0 Jinja2==3.0.1 MarkupSafe==2.0.1 boto3==1.18.56 requests==2.26.0
      - |
        if [ $TF_DISTRIBUTION = "oss" ]; then
          TF_BACKEND_REGION=$(aws ssm get-parameter --name "/aft/config/oss-backend/primary-region" --query "Parameter.Value" --output text)
          TF_KMS_KEY_ID=$(aws ssm get-parameter --name "/aft/config/oss-backend/kms-key-id" --query "Parameter.Value" --output text)
          TF_DDB_TABLE=$(aws ssm get-parameter --name "/aft/config/oss-backend/table-id" --query "Parameter.Value" --output text)
          TF_S3_BUCKET=$(aws ssm get-parameter --name "/aft/config/oss-backend/bucket-id" --query "Parameter.Value" --output text)
          TF_S3_KEY=$VENDED_ACCOUNT_ID-aft-global-customizations/terraform.tfstate
          cd /tmp
          echo "Installing Terraform"
          curl -o terraform_${TF_VERSION}_linux_amd64.zip https://releases.hashicorp.com/terraform/${TF_VERSION}/terraform_${TF_VERSION}_linux_amd64.zip
          unzip -o terraform_${TF_VERSION}_linux_amd64.zip && mv terraform /usr/bin
          terraform --version
          cd $DEFAULT_PATH/terraform
          for f in *.jinja; do jinja2 $f -D timestamp="$TIMESTAMP" -D tf_distribution_type=$TF_DISTRIBUTION -D provider_region=$CT_MGMT_REGION -D region=$TF_BACKEND_REGION -D aft_admin_role_arn=$AFT_EXEC_ROLE_ARN -D target_admin_role_arn=$VENDED_EXEC_ROLE_ARN -D bucket=$TF_S3_BUCKET -D key=$TF_S3_KEY -D dynamodb_table=$TF_DDB_TABLE -D kms_key_id=$TF_KMS_KEY_ID >> ./$(basename $f .jinja).tf; done
          for f in *.tf; do echo "\n \n"; echo $f; cat $f; done
          JSON=$(aws sts assume-role --role-arn ${AFT_ADMIN_ROLE_ARN} --role-session-name ${ROLE_SESSION_NAME})
          # Make newly assumed role default session
          export AWS_ACCESS_KEY_ID=$(echo ${JSON} | jq --raw-output ".Credentials[\"AccessKeyId\"]")
          export AWS_SECRET_ACCESS_KEY=$(echo ${JSON} | jq --raw-output ".Credentials[\"SecretAccessKey\"]")
          export AWS_SESSION_TOKEN=$(echo ${JSON} | jq --raw-output ".Credentials[\"SessionToken\"]")
          terraform init
        else
          TF_BACKEND_REGION=$(aws ssm get-parameter --name "/aft/config/oss-backend/primary-region" --query "Parameter.Value" --output text)
          TF_ORG_NAME=$(aws ssm get-parameter --name "/aft/config/terraform/org-name" --query "Parameter.Value" --output text)
          TF_TOKEN=$(aws ssm get-parameter --name "/aft/config/terraform/token" --with-decryption --query "Parameter.Value" --output text)
          TF_ENDPOINT=$(aws ssm get-parameter --name "/aft/config/terraform/api-endpoint" --query "Parameter.Value" --output text)
          TF_WORKSPACE_NAME=$VENDED_ACCOUNT_ID-aft-global-customizations
          TF_CONFIG_PATH="./temp_configuration_file.tar.gz"
          cd $DEFAULT_PATH/terraform
          for f in *.jinja; do jinja2 $f -D timestamp="$TIMESTAMP" -D provider_region=$CT_MGMT_REGION -D tf_distribution_type=$TF_DISTRIBUTION -D aft_admin_role_arn=$AFT_EXEC_ROLE_ARN -D target_admin_role_arn=$VENDED_EXEC_ROLE_ARN -D terraform_org_name=$TF_ORG_NAME -D terraform_workspace_name=$TF_WORKSPACE_NAME >> ./$(basename $f .jinja).tf; done
          for f in *.tf; do echo "\n \n"; echo $f; cat $f; done
          cd $DEFAULT_PATH
          tar -czf temp_configuration_file.tar.gz -C terraform --exclude .git --exclude venv .
          python3 $DEFAULT_PATH/aws-aft-core-framework/sources/scripts/workspace_manager.py --operation "deploy" --organization_name $TF_ORG_NAME --workspace_name $TF_WORKSPACE_NAME --assume_role_arn $AFT_ADMIN_ROLE_ARN --assume_role_session_name $ROLE_SESSION_NAME --api_endpoint $TF_ENDPOINT --api_token $TF_TOKEN --terraform_version $TF_VERSION --config_file $TF_CONFIG_PATH
        fi
  build:
    commands:
      - |
        if [ $TF_DISTRIBUTION = "oss" ]; then
          terraform apply --auto-approve
        fi
```
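The template-rendering loop in the buildspec above turns every `*.jinja` file into a same-named `.tf` file via `$(basename $f .jinja)`. A minimal sketch of that pattern, with `sed` standing in for `jinja2-cli` so it runs without extra packages (the demo directory, template, and placeholder are illustrative only):

```shell
# Render every *.jinja template to a matching .tf file, mirroring the
# buildspec's jinja2 loop; sed substitutes one {{ timestamp }} placeholder.
mkdir -p /tmp/jinja-demo
cd /tmp/jinja-demo
cat > backend.jinja <<'EOF'
# rendered at {{ timestamp }}
EOF
TIMESTAMP='2023-01-01 00:00:00'
for f in *.jinja; do
  sed "s/{{ timestamp }}/$TIMESTAMP/" "$f" > "$(basename "$f" .jinja).tf"
done
cat backend.tf   # -> "# rendered at 2023-01-01 00:00:00"
```

The real buildspec passes many more `-D` variables (backend bucket, role ARNs, etc.), but the file-naming mechanics are the same.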
I could potentially fix up the buildspec in the broken pipelines, but this presents several further questions.
Hey @patrickmoore-nc, thanks for reaching out about this. We've just gone ahead and pinned an issue in regards to this, https://github.com/aws-ia/terraform-aws-control_tower_account_factory/issues/205, that suggests a few ways to address the issue you've brought up here.
The customizations pipelines themselves do not get automatically updated at AFT deployment/upgrade time. For AFT versions older than 1.5.0, customizations pipelines are only redeployed upon an account request. For AFT versions 1.5.0 and newer, both account requests and customizations will invoke a CodeBuild container to redeploy the $ACCOUNT_ID-customizations-pipeline.
Closing this issue for now, please feel free to reach out in case of further questions / concerns.
AFT Version: Observed this change while running v1.3.5, even though the deployment had not been updated beyond that version. There was a defect, fixed in v1.3.6, which allowed some AFT components to run the latest version regardless of the version installed, so it looks like the v1.5.0 pipeline changes partially updated this v1.3.5 deployment. However, only some of the vended accounts' pipelines were affected (not all). I have updated to AFT v1.5.1; the issue still persists.
Terraform Version & Provider Versions: Please provide the outputs of terraform version and terraform providers from within your AFT environment.
terraform version
terraform providers
Bug Description: The new merged customizations pipelines' buildspecs are missing the pre- and post-API-helper shell scripts; it seems that only the Terraform apply job has been retained. My AFT-based solution used these hooks to drive further automation, which is now broken.
Also, only some of the pipelines for vended accounts seem to have been updated, but not all.
To Reproduce: In the AFT account, find the CodePipeline for a vended account and observe that it has been merged down to a single build job (from three jobs). Not all vended account pipelines have been modified, so the behavior is inconsistent. Inspect the buildspec: it only runs terraform apply, with no launch of the pre- and post-API-helper shell scripts.
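One quick way to confirm which buildspecs lost the hooks is to save the buildspec from a vended account's customizations CodeBuild project and grep it for the helper invocations. A sketch against a hypothetical sample of a merged buildspec (the file path and sample content are illustrative, not pulled from a real pipeline):

```shell
# Hypothetical sample: a merged buildspec that only runs terraform apply.
cat > /tmp/sample-buildspec.yml <<'EOF'
version: 0.2
phases:
  build:
    commands:
      - terraform apply --auto-approve
EOF

# Report whether each API-helper hook is still referenced anywhere.
for hook in pre-api-helpers.sh post-api-helpers.sh; do
  if grep -q "$hook" /tmp/sample-buildspec.yml; then
    echo "present: $hook"
  else
    echo "missing: $hook"
  fi
done
```

Running this against each affected account's buildspec would show which pipelines were merged and which still carry the hooks.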
Expected behavior: The pre- and post-API-helper shell scripts should be launched during the customizations pipeline: https://github.com/aws-ia/terraform-aws-control_tower_account_factory/tree/main/sources/aft-customizations-repos/aft-global-customizations/api_helpers
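In broad strokes, restoring the hooks in a merged buildspec would mean wrapping the apply step with the two helper scripts. A sketch of what that build phase could look like (this is an assumption of mine, not the exact upstream buildspec, and it assumes the api_helpers directory sits in the repository root as linked above):

```yaml
  build:
    commands:
      - chmod +x ./api_helpers/pre-api-helpers.sh && ./api_helpers/pre-api-helpers.sh
      - |
        if [ $TF_DISTRIBUTION = "oss" ]; then
          terraform apply --auto-approve
        fi
      - chmod +x ./api_helpers/post-api-helpers.sh && ./api_helpers/post-api-helpers.sh
```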
Additional context: This problem affected this install without even updating to v1.5.0 (the version that changed the customizations pipeline). This sets a very poor reliability precedent: AFT can break without customers even actively deploying a new version.