astronomer / astronomer-cosmos

Run your dbt Core projects as Apache Airflow DAGs and Task Groups with a few lines of code
https://astronomer.github.io/astronomer-cosmos/
Apache License 2.0

DBT Docs generate allow for execution config #1066

Closed DanMawdsleyBA closed 2 weeks ago

DanMawdsleyBA commented 2 weeks ago

Description

As part of the DbtDocsOperator there isn't an option to specify the execution config, unlike with the task group or DAG. This means that if dbt is installed in a virtual environment, the docs operator is not able to use it.

Also, connection_id is mandatory, but I am able to connect to S3 from other tasks without specifying a connection ID, so could this be made optional? (It seems to fail on the Pydantic validation.)
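For comparison, the DAG-level API does expose this: a DbtDag accepts an ExecutionConfig that can point at a dbt binary inside a venv. A minimal sketch of the asymmetry, assuming Cosmos's ProjectConfig/ProfileConfig/ExecutionConfig classes and hypothetical paths and names:

```python
import os

from cosmos import DbtDag, ExecutionConfig, ProfileConfig, ProjectConfig

# Hypothetical profile; names and paths here are placeholders.
profile_config = ProfileConfig(
    profile_name="default",
    target_name="dev",
    profiles_yml_filepath=f"{os.environ['AIRFLOW_HOME']}/dags/shared/dbt/profiles.yml",
)

# At the DAG level you can point Cosmos at a venv-installed dbt...
dbt_dag = DbtDag(
    dag_id="my_dbt_dag",
    project_config=ProjectConfig(f"{os.environ['AIRFLOW_HOME']}/dags/shared/dbt"),
    profile_config=profile_config,
    execution_config=ExecutionConfig(
        dbt_executable_path=f"{os.environ['AIRFLOW_HOME']}/dbt_venv/bin/dbt",
    ),
)
# ...but DbtDocsS3Operator takes no execution_config parameter,
# which is the gap this issue describes.
```

This is a config sketch, not a runnable example: it requires an Airflow deployment with astronomer-cosmos installed and a real dbt project on disk.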

Use case/motivation

Allows for more use of the dbt docs generator.

Related issues

No response

Are you willing to submit a PR?

DanMawdsleyBA commented 2 weeks ago

Was able to get it working now: you can still pass dbt_executable_path and just use a dummy value for connection_id.

import os

from cosmos import DbtDocsS3Operator

generate_dbt_docs_aws = DbtDocsS3Operator(
    task_id="generate_dbt_docs_aws",
    project_dir=f"{os.environ['AIRFLOW_HOME']}/dags/shared/dbt",
    profile_config=profile_config,
    # Point at the dbt binary inside the venv
    dbt_executable_path=f"{os.environ['AIRFLOW_HOME']}/dbt_venv/bin/dbt",
    # docs-specific arguments
    connection_id="dummy",  # placeholder to satisfy the validation
    bucket_name="bucket",
    folder_dir="folder",
)