Overview
We need to perform a timeboxed spike, as part of our OKR to get Airflow running, into how we can properly utilise our environments for testing.
The question is: How can different configurations be deployed to different environments?
Tech Approach
Organise a call with OE when the ticket is picked up to discuss expectations.
Currently the code which downloads the configuration files is in the makerules repo, in the pipeline.mk file.
Right now it downloads from the main branch of the config repo (via the raw GitHub URL); if the file isn't found there, the production collection archive bucket is used instead (see the sketches at the end of this section).
The easiest way to deploy into different environments is to fetch values only from the S3 buckets.
To support this you will need to build CI/CD pipelines that test/validate the configuration files and push them up to each of the environments; a sketch of the publish step follows the Tasks list.
This will require the DevOps team to add environments with the relevant permissions to the config repo.
If possible, lock the permissions down so they apply only to the config directory.
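For context, here is a minimal sketch of the download behaviour described above and of the proposed S3-only variant. All repo URLs, bucket names, and target names are illustrative assumptions, not the actual contents of pipeline.mk.

```make
# Sketch of the CURRENT behaviour: try the main branch of the config
# repo first, then fall back to the production collection archive
# bucket. Names and URLs are placeholders, not the real values.
CONFIG_URL  := https://raw.githubusercontent.com/<org>/config/main
ARCHIVE_URL := https://<production-collection-archive-bucket>.s3.amazonaws.com

config/%:
	curl -fsL $(CONFIG_URL)/config/$* -o $@ \
		|| curl -fsL $(ARCHIVE_URL)/config/$* -o $@
```

And the proposed S3-only variant, keyed off a single environment variable:

```make
# Sketch of the PROPOSED behaviour: fetch only from an S3 bucket, one
# per environment, selected via ENVIRONMENT (the bucket naming scheme
# is an assumption to be agreed with DevOps).
ENVIRONMENT   ?= development
CONFIG_BUCKET ?= <project>-config-$(ENVIRONMENT)

config/%:
	aws s3 cp s3://$(CONFIG_BUCKET)/config/$* $@
```

Driving everything from one ENVIRONMENT variable is what the makerules and airflow-dags tasks below rely on: airflow-dags exports it, and makerules uses it to pick the right bucket.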
Acceptance Criteria
Tickets are written up for future development to allow this to be configured.
A member of the DM team must be able to deploy a branch of the config repo to the development environment.
Deployment to staging and production must happen from the main branch.
Deployment to main should require approval from a defined list of approvers.
Open question: who should be on the list of approvers?
Tasks
[ ] Set up a CI environment for the config repository using Terraform
[ ] Add the files CDN base URL as an Airflow custom parameter using Terraform
[ ] config <- publish the configuration to an S3 bucket per environment
[ ] makerules <- download the config from the AWS bucket instead, using an environment variable to select the correct environment/bucket
[ ] airflow-dags <- pass the environment variable through, if not already done
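As a starting point for the config and makerules tasks above, a hedged sketch of targets the config repo's CI/CD pipeline could run. The bucket name and the validation check are assumptions, not an agreed design.

```make
# Hypothetical targets for the config repo's CI/CD pipeline.
ENVIRONMENT   ?= development
CONFIG_BUCKET ?= <project>-config-$(ENVIRONMENT)

.PHONY: validate publish

validate:
	# Placeholder check: fail the build if any configuration file is empty.
	test -z "$$(find config -type f -empty)"

publish: validate
	# Push only the config directory, matching the locked-down permissions.
	aws s3 sync config/ s3://$(CONFIG_BUCKET)/config/
```

The development job could run `make publish ENVIRONMENT=development` from any branch, while the staging and production jobs run the same target from main once the approvers have signed off.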