The code in this repository is meant to accompany this blog post on beginner and advanced implementation concepts at the intersection of dbt and Airflow.
Clone this repository, `cd` into it, and run `astro dev start` (using the Astro CLI) to spin up a local Airflow environment and run the accompanying DAGs on your machine.

We are currently using the jaffle_shop sample dbt project.
The only files required for the Airflow DAGs to run are `dbt_project.yml`, `profiles.yml`, and `target/manifest.json`, but we included the models for completeness. If you would like to try these DAGs with your own dbt workflow, feel free to drop in your own project files.
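The `manifest.json` file matters because it describes every dbt node and its dependencies, which is what a DAG needs to order its tasks. A minimal sketch of reading that graph (the inline manifest below is heavily abbreviated and the helper function is hypothetical, but the `nodes` / `depends_on` keys match dbt's manifest schema):

```python
import json

# Abbreviated stand-in for target/manifest.json; in a DAG you would load the
# real file, e.g. json.load(open("target/manifest.json")).
manifest = {
    "nodes": {
        "model.jaffle_shop.stg_customers": {
            "resource_type": "model",
            "depends_on": {"nodes": ["seed.jaffle_shop.raw_customers"]},
        },
        "model.jaffle_shop.customers": {
            "resource_type": "model",
            "depends_on": {"nodes": ["model.jaffle_shop.stg_customers"]},
        },
    }
}

def model_dependencies(manifest: dict) -> dict:
    """Map each dbt model to the upstream models it depends on."""
    deps = {}
    for node_id, node in manifest["nodes"].items():
        if node["resource_type"] != "model":
            continue
        # Keep only model-to-model edges; seeds/sources are handled elsewhere.
        deps[node_id] = [
            parent for parent in node["depends_on"]["nodes"]
            if parent.startswith("model.")
        ]
    return deps

deps = model_dependencies(manifest)
```

A DAG factory can then create one task per model and wire `set_upstream` relationships from this mapping.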
If you make changes to the dbt models, run `dbt compile` to update the `manifest.json` file. This may be done manually during development, as part of a CI/CD pipeline, or as a separate step in a production pipeline run before the Airflow DAG is triggered.

The database connection is defined in `profiles.yml`, which is configured to use environment variables. The database credentials from an Airflow connection are passed as environment variables to the `BashOperator` tasks running the dbt commands.

The DAGs include a `dbt_seed`
task at the beginning that loads sample data into the database. This is simply for the purpose of this demo.
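Conceptually, the credential hand-off works by mapping Airflow connection fields to the environment variables that `profiles.yml` references, then passing that mapping to each task's `env` argument. A simplified sketch (the `DBT_*` names and the stand-in `Connection` tuple are assumptions; in a real DAG you would fetch the connection with Airflow's `BaseHook.get_connection` and match the variable names to your own `profiles.yml`):

```python
from collections import namedtuple

# Stand-in for an Airflow Connection object; in a DAG you would instead use
# BaseHook.get_connection("<your_conn_id>").
Connection = namedtuple("Connection", ["host", "login", "password", "schema"])

def dbt_env_from_connection(conn) -> dict:
    """Map connection fields to the env vars referenced by profiles.yml.

    The DBT_* names here are illustrative; use whatever names your
    profiles.yml reads via env_var().
    """
    return {
        "DBT_HOST": conn.host,
        "DBT_USER": conn.login,
        "DBT_PASSWORD": conn.password,
        "DBT_SCHEMA": conn.schema,
    }

conn = Connection(host="localhost", login="dbt", password="s3cret", schema="public")
env = dbt_env_from_connection(conn)
# This dict would be passed as the `env` argument of each BashOperator,
# e.g. BashOperator(task_id="dbt_run", bash_command="dbt run", env=env).
```

Keeping credentials in the Airflow connection (rather than hard-coding them in `profiles.yml`) means the same dbt project can run unchanged across environments.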