Open EugenioG2021 opened 5 months ago
🤖
Based on the information you've provided and the context from the Cosmos repository, the `target/run` directory, which contains the compiled SQL, should typically be located within your dbt project directory. In your case, that is the `data_eng_dbt` directory specified in your `ProjectConfig`.
However, if the `target/run` directory is not found, it could be due to a few reasons:

1. The dbt project has not been run yet. The `target/run` directory is created when a dbt project is run; if the project has not been run, the directory will not exist.
2. The dbt project directory is not correctly specified. In your case, it is specified as `/usr/local/airflow/dags/dbt/data_eng_dbt`. Make sure that this path correctly points to your dbt project directory.
3. The dbt project directory does not have the correct permissions. Make sure that the user running the Airflow DAG has read and write permissions on the dbt project directory.

If none of these solutions work, you may need to manually create the `target/run` directory within your dbt project directory.
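The three checks above can be scripted on the Airflow worker. A minimal sketch — the helper name `check_target_run` is illustrative, not part of Cosmos or dbt:

```python
import os

def check_target_run(project_dir: str) -> list[str]:
    """Return findings explaining why target/run may be missing."""
    findings = []
    # Reason 2: the project path itself may be wrong.
    if not os.path.isdir(project_dir):
        return [f"project directory does not exist: {project_dir}"]
    # Reason 3: the Airflow user may lack read/write permission.
    if not os.access(project_dir, os.R_OK | os.W_OK):
        findings.append(f"missing read/write permission on: {project_dir}")
    # Reason 1: dbt only creates target/run when the project is run.
    target_run = os.path.join(project_dir, "target", "run")
    if not os.path.isdir(target_run):
        findings.append(f"not yet created (dbt has not run?): {target_run}")
        if os.access(project_dir, os.W_OK):
            # Last-resort manual creation, as suggested above.
            os.makedirs(target_run, exist_ok=True)
    return findings
```

Running this against the configured `dbt_project_path` narrows the problem down before digging into Cosmos internals.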
Regarding your question about how Cosmos handles the `target-path` configuration from the `dbt_project.yml` file, I wasn't able to find a specific answer within the repository. It's possible that this information is available elsewhere, or I may have missed it.
If you have any additional information or context that could help me better understand your issue, please provide it. I'm here to help!
@EugenioG2021 Cosmos currently creates temporary directories to export the target directory. This was done intentionally, since other users faced issues with read-only file systems. That said, I agree that we should allow users to specify the target directory if they want to control this.
Would you be interested in contributing this feature?
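The temporary-directory behaviour described above also explains why no `target/run` appears in the project tree. A small sketch of the pattern — `run_in_scratch_copy` is a hypothetical stand-in, not the actual Cosmos implementation:

```python
import shutil
import tempfile
from pathlib import Path

def run_in_scratch_copy(project_dir: str) -> str:
    """Simulate running a dbt project from a throwaway temp copy."""
    with tempfile.TemporaryDirectory() as scratch:
        copy = Path(scratch) / "project"
        # Work on a copy, so read-only source file systems are fine.
        shutil.copytree(project_dir, copy)
        # Stand-in for invoking dbt: write a compiled artifact under target/run.
        run_dir = copy / "target" / "run"
        run_dir.mkdir(parents=True)
        (run_dir / "model.sql").write_text("select 1")
        # When the context manager exits, the whole copy (target/run
        # included) is deleted; the original project dir never sees it.
        return str(run_dir)
```

Because the scratch copy is removed as soon as the task finishes, the compiled SQL path reported in the logs points at a directory that no longer exists by the time you go looking for it.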
I have run a model in Airflow, and it says the compiled SQL can be found at some `target/{some_subdirectory}` path. However, I am not seeing any `target` directory created in any of these places:

1. The project directory (the `dbt_project_path` argument of `ProjectConfig` in my DAG Python file)
2. The `dags` directory
3. The directory of the dbt executable (`dbt_executable_path` in `ExecutionConfig`)
This is my `DbtTaskGroup`, which I use in the Airflow DAG:
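(The original snippet is not shown here; a representative `DbtTaskGroup` sketch, assuming typical Cosmos usage — the profile values, schedule, and dbt executable path below are placeholders, not the user's actual configuration:)

```python
from datetime import datetime

from airflow import DAG
from cosmos import DbtTaskGroup, ExecutionConfig, ProfileConfig, ProjectConfig

# Project path taken from the issue; everything else is a placeholder.
DBT_PROJECT_PATH = "/usr/local/airflow/dags/dbt/data_eng_dbt"

with DAG(dag_id="data_eng_dbt", start_date=datetime(2023, 1, 1), schedule=None) as dag:
    dbt_group = DbtTaskGroup(
        group_id="dbt_models",
        project_config=ProjectConfig(dbt_project_path=DBT_PROJECT_PATH),
        profile_config=ProfileConfig(
            profile_name="data_eng_dbt",
            target_name="dev",
            profiles_yml_filepath="/usr/local/airflow/dags/dbt/profiles.yml",
        ),
        execution_config=ExecutionConfig(
            dbt_executable_path="/usr/local/airflow/dbt_venv/bin/dbt",
        ),
    )
```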
On the other hand, my `dbt_project.yml` is as follows:
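(The file itself is not shown; a `dbt_project.yml` that sets `target-path` typically looks like the sketch below — all values are placeholders, and note that `target-path` in `dbt_project.yml` is deprecated in newer dbt versions in favour of flags:)

```yaml
name: data_eng_dbt
version: "1.0.0"
profile: data_eng_dbt

model-paths: ["models"]

# Custom location for compiled artifacts. Cosmos may effectively override
# this by executing the project from a temporary directory.
target-path: "custom_target"
```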
My project scaffolding, starting at Airflow's directory, is:
When the Airflow DAG ran and I had an error in the SQL of a model called `intermediate.int_unite_factevents_pstage`, it said my compiled SQL should be at `target/run/data_eng_dbt/models/intermediate/placement_id_match/int_unite_factevents_pstage.sql`.
However, I cannot find that `target/run` directory anywhere. Where should it be? And does the `dbt_project.yml` come into play here? Because I did specify the `target-path` there. I also have no system environment variables for `dbt_target` as mentioned here, and I run that `DbtTaskGroup` instance by just putting it inside a standard Airflow DAG.
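Since Cosmos executes the project from a temporary copy, one way to confirm where the artifacts actually land is to search for the compiled file by name on the worker. A small sketch — the search root is an assumption; adjust it to wherever your worker keeps temp files:

```python
import os

def find_compiled_sql(root: str, filename: str) -> list[str]:
    """Walk `root` and collect every path whose basename equals `filename`."""
    hits = []
    # os.walk silently skips directories it cannot read.
    for dirpath, _dirnames, filenames in os.walk(root):
        if filename in filenames:
            hits.append(os.path.join(dirpath, filename))
    return hits

# Example, run on the Airflow worker while/just after the task executes:
# find_compiled_sql("/tmp", "int_unite_factevents_pstage.sql")
```

If the file only ever shows up under a temp path, that confirms the temporary-directory behaviour rather than a misconfigured `dbt_project_path`.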