ysjjovo opened 2 weeks ago
To address the dependency conflict issue with astronomer-cosmos[dbt-redshift]==1.3.2
in MWAA 2.8.1, follow these steps:
Isolate dbt in a Separate Virtual Environment: Use MWAA's startup script to create a virtual environment specifically for dbt. This approach allows you to manage dbt and its dependencies separately from the main environment, avoiding conflicts.
#!/bin/sh
export DBT_VENV_PATH="${AIRFLOW_HOME}/dbt_venv"
export PIP_USER=false
python3 -m venv "${DBT_VENV_PATH}"
${DBT_VENV_PATH}/bin/pip install dbt-redshift
export PIP_USER=true
Modify Your requirements.txt: List astronomer-cosmos without the dbt-redshift extra to prevent automatic installation of conflicting versions.
astronomer-cosmos
Manually Adjust Dependency Versions: After setting up the virtual environment, manually install or adjust the versions of redshift-connector or boto3 to ensure compatibility.
${DBT_VENV_PATH}/bin/pip install 'redshift-connector==2.0.918' 'boto3==1.33.13'
Point to the Correct dbt Executable: In your DAG, use the dbt_executable_path argument to specify the path to the dbt executable within the virtual environment.
import os

execution_config = ExecutionConfig(
    dbt_executable_path=f"{os.environ['AIRFLOW_HOME']}/dbt_venv/bin/dbt",
)
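As a sanity check, the path can be resolved and validated before it is wired into ExecutionConfig. A minimal sketch, assuming the venv layout created by the startup script above; the helper name and the default AIRFLOW_HOME fallback are illustrative, not part of Cosmos:

```python
import os
from pathlib import Path


def resolve_dbt_executable(airflow_home=None):
    """Build the expected dbt path inside the venv created by the startup script."""
    home = airflow_home or os.environ.get("AIRFLOW_HOME", "/usr/local/airflow")
    return Path(home) / "dbt_venv" / "bin" / "dbt"


dbt_path = resolve_dbt_executable()
if not dbt_path.is_file():
    # Fail fast at DAG-parse time instead of at task runtime.
    print(f"warning: dbt executable not found at {dbt_path}")
```

You would then pass `str(dbt_path)` as `dbt_executable_path`; a missing file here usually means the startup script did not run or the venv path differs.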
This setup ensures that dbt and its dependencies are managed in isolation, mitigating the conflict and allowing for a smoother integration with MWAA.
To continue the conversation, mention @dosu.
Great, it works for me! But now I have a new problem: my execution mode is Kubernetes (EKS 1.27), and tasks do not seem to actually run on k8s. Pod status changes from Pending to Completed in about 0s.
➜ ~ kubectl get po -A -w
NAMESPACE NAME READY STATUS RESTARTS AGE
kube-system aws-node-62v44 2/2 Running 0 5h48m
kube-system coredns-6dfd46db99-mszjc 1/1 Running 0 5h54m
kube-system coredns-6dfd46db99-xfrls 1/1 Running 0 5h54m
kube-system kube-proxy-c8vs2 1/1 Running 0 5h48m
mwaa dbt-task-group-base-public-category-run-4hrk5hmz 0/1 Completed 0 10s
mwaa dbt-task-group-base-public-event-run-94r4a8hr 0/1 Completed 0 9s
mwaa dbt-task-group-base-public-sales-run-6o1no285 0/1 ContainerCreating 0 7s
mwaa dbt-task-group-base-public-users-run-51toaaz5 0/1 Completed 0 7s
mwaa dbt-task-group-my-first-dbt-model-run-ouja4593 0/1 Completed 0 7s
mwaa dbt-task-group-base-public-users-run-51toaaz5 0/1 Completed 0 9s
mwaa dbt-task-group-my-first-dbt-model-run-ouja4593 0/1 Completed 0 9s
mwaa dbt-task-group-base-public-category-run-4hrk5hmz 0/1 Completed 0 12s
mwaa dbt-task-group-base-public-event-run-94r4a8hr 0/1 Completed 0 11s
mwaa dbt-task-group-base-public-users-run-51toaaz5 0/1 Completed 0 9s
mwaa dbt-task-group-base-public-category-run-4hrk5hmz 0/1 Completed 0 12s
mwaa dbt-task-group-my-first-dbt-model-run-ouja4593 0/1 Completed 0 9s
mwaa dbt-task-group-base-public-event-run-94r4a8hr 0/1 Completed 0 11s
mwaa dbt-task-group-base-public-category-run-4hrk5hmz 0/1 Terminating 0 13s
mwaa dbt-task-group-base-public-category-run-4hrk5hmz 0/1 Terminating 0 13s
mwaa dbt-task-group-base-public-event-run-94r4a8hr 0/1 Terminating 0 12s
mwaa dbt-task-group-base-public-event-run-94r4a8hr 0/1 Terminating 0 12s
mwaa dbt-task-group-base-public-users-run-51toaaz5 0/1 Terminating 0 10s
mwaa dbt-task-group-base-public-users-run-51toaaz5 0/1 Terminating 0 10s
mwaa dbt-task-group-my-first-dbt-model-run-ouja4593 0/1 Terminating 0 10s
mwaa dbt-task-group-my-first-dbt-model-run-ouja4593 0/1 Terminating 0 10s
mwaa dbt-task-group-base-public-sales-run-6o1no285 0/1 Completed 0 14s
mwaa dbt-task-group-base-public-sales-run-6o1no285 0/1 Completed 0 15s
mwaa dbt-task-group-base-public-sales-run-6o1no285 0/1 Terminating 0 16s
mwaa dbt-task-group-base-public-sales-run-6o1no285 0/1 Terminating 0 16s
mwaa dbt-task-group-base-public-listing-run-rt4ofxra 0/1 Pending 0 0s
mwaa dbt-task-group-base-public-listing-run-rt4ofxra 0/1 Pending 0 0s
mwaa dbt-task-group-base-public-listing-run-rt4ofxra 0/1 ContainerCreating 0 0s
mwaa dbt-task-group-base-public-listing-run-rt4ofxra 0/1 Completed 0 1s
mwaa dbt-task-group-my-first-dbt-model-test-lapl1fwq 0/1 Pending 0 0s
mwaa dbt-task-group-my-first-dbt-model-test-lapl1fwq 0/1 Pending 0 0s
mwaa dbt-task-group-my-first-dbt-model-test-lapl1fwq 0/1 ContainerCreating 0 0s
mwaa dbt-task-group-my-first-dbt-model-test-lapl1fwq 0/1 Completed 0 1s
mwaa dbt-task-group-base-public-listing-run-rt4ofxra 0/1 Completed 0 2s
mwaa dbt-task-group-base-public-date-run-qabzl5z5 0/1 Pending 0 0s
mwaa dbt-task-group-base-public-date-run-qabzl5z5 0/1 Pending 0 0s
mwaa dbt-task-group-base-public-date-run-qabzl5z5 0/1 ContainerCreating 0 1s
mwaa dbt-task-group-base-public-venue-run-b9ke290z 0/1 Pending 0 0s
mwaa dbt-task-group-base-public-venue-run-b9ke290z 0/1 Pending 0 0s
mwaa dbt-task-group-base-public-venue-run-b9ke290z 0/1 ContainerCreating 0 0s
mwaa dbt-task-group-base-public-listing-run-rt4ofxra 0/1 Completed 0 3s
mwaa dbt-task-group-base-public-date-run-qabzl5z5 0/1 Completed 0 1s
mwaa dbt-task-group-my-first-dbt-model-test-lapl1fwq 0/1 Completed 0 2s
mwaa dbt-task-group-base-public-listing-run-rt4ofxra 0/1 Terminating 0 4s
mwaa dbt-task-group-base-public-listing-run-rt4ofxra 0/1 Terminating 0 4s
mwaa dbt-task-group-rpt-finance-qtr-total-sales-by-event-ru-1envlgl2 0/1 Pending 0 0s
mwaa dbt-task-group-rpt-finance-qtr-total-sales-by-event-ru-1envlgl2 0/1 Pending 0 0s
mwaa dbt-task-group-rpt-finance-qtr-total-sales-by-event-ru-1envlgl2 0/1 ContainerCreating 0 0s
mwaa dbt-task-group-my-first-dbt-model-test-lapl1fwq 0/1 Terminating 0 3s
mwaa dbt-task-group-my-first-dbt-model-test-lapl1fwq 0/1 Terminating 0 3s
mwaa dbt-task-group-base-public-date-run-qabzl5z5 0/1 Completed 0 2s
mwaa dbt-task-group-base-public-date-run-qabzl5z5 0/1 Terminating 0 3s
mwaa dbt-task-group-base-public-date-run-qabzl5z5 0/1 Terminating 0 3s
mwaa dbt-task-group-my-second-dbt-model-run-7a3me2r9 0/1 Pending 0 0s
mwaa dbt-task-group-my-second-dbt-model-run-7a3me2r9 0/1 Pending 0 0s
mwaa dbt-task-group-my-second-dbt-model-run-7a3me2r9 0/1 ContainerCreating 0 0s
mwaa dbt-task-group-my-second-dbt-model-run-7a3me2r9 0/1 Completed 0 1s
mwaa dbt-task-group-my-second-dbt-model-run-7a3me2r9 0/1 Completed 0 2s
mwaa dbt-task-group-my-second-dbt-model-run-7a3me2r9 0/1 Terminating 0 3s
mwaa dbt-task-group-my-second-dbt-model-run-7a3me2r9 0/1 Terminating 0 3s
mwaa dbt-task-group-my-second-dbt-model-test-knk3wih3 0/1 Pending 0 0s
mwaa dbt-task-group-my-second-dbt-model-test-knk3wih3 0/1 Pending 0 0s
mwaa dbt-task-group-my-second-dbt-model-test-knk3wih3 0/1 ContainerCreating 0 0s
mwaa dbt-task-group-my-second-dbt-model-test-knk3wih3 0/1 Completed 0 13s
mwaa dbt-task-group-my-second-dbt-model-test-knk3wih3 0/1 Completed 0 15s
mwaa dbt-task-group-my-second-dbt-model-test-knk3wih3 0/1 Terminating 0 15s
mwaa dbt-task-group-my-second-dbt-model-test-knk3wih3 0/1 Terminating 0 15s
mwaa dbt-task-group-rpt-finance-qtr-total-sales-by-event-ru-1envlgl2 0/1 Completed 0 38s
mwaa dbt-task-group-rpt-finance-qtr-total-sales-by-event-ru-1envlgl2 0/1 Completed 0 40s
mwaa dbt-task-group-rpt-finance-qtr-total-sales-by-event-ru-1envlgl2 0/1 Completed 0 40s
mwaa dbt-task-group-base-public-venue-run-b9ke290z 0/1 Completed 0 41s
mwaa dbt-task-group-rpt-finance-qtr-total-sales-by-event-ru-1envlgl2 0/1 Terminating 0 41s
mwaa dbt-task-group-rpt-finance-qtr-total-sales-by-event-ru-1envlgl2 0/1 Terminating 0 41s
mwaa dbt-task-group-base-public-venue-run-b9ke290z 0/1 Completed 0 43s
mwaa dbt-task-group-base-public-venue-run-b9ke290z 0/1 Terminating 0 43s
mwaa dbt-task-group-base-public-venue-run-b9ke290z 0/1 Terminating 0 43s
mwaa dbt-task-group-rpt-finance-qtr-top-events-by-sales-run-cjf6gwpq 0/1 Pending 0 0s
mwaa dbt-task-group-rpt-finance-qtr-top-events-by-sales-run-cjf6gwpq 0/1 Pending 0 0s
mwaa dbt-task-group-rpt-finance-qtr-top-events-by-sales-run-cjf6gwpq 0/1 ContainerCreating 0 0s
mwaa dbt-task-group-rpt-finance-qtr-top-events-by-sales-run-cjf6gwpq 0/1 Completed 0 0s
mwaa dbt-task-group-rpt-finance-qtr-top-events-by-sales-run-cjf6gwpq 0/1 Completed 0 2s
mwaa dbt-task-group-rpt-finance-qtr-top-events-by-sales-run-cjf6gwpq 0/1 Terminating 0 2s
mwaa dbt-task-group-rpt-finance-qtr-top-events-by-sales-run-cjf6gwpq 0/1 Terminating 0 2s
Below is my DAG Python code.
from pathlib import Path

from airflow import DAG
from pendulum import datetime

from cosmos.config import ProjectConfig
from cosmos import (
    ProfileConfig,
    ExecutionConfig,
    ExecutionMode,
    DbtTaskGroup,
)
from cosmos.profiles import RedshiftUserPasswordProfileMapping
from airflow.operators.empty import EmptyOperator

PROJECT_DIR = Path("dags/dbt/dbtcicdnew/")

with DAG(
    dag_id="dbtcicdnew",
    start_date=datetime(2022, 11, 27),
    doc_md=__doc__,
    catchup=False,
) as dag:
    run_models = DbtTaskGroup(
        profile_config=ProfileConfig(
            profile_name="dbtcicdnew",
            target_name="dev",
            profile_mapping=RedshiftUserPasswordProfileMapping(
                conn_id="redshift_default",
                profile_args={
                    "schema": "public",
                },
            ),
        ),
        project_config=ProjectConfig(PROJECT_DIR),
        execution_config=ExecutionConfig(
            execution_mode=ExecutionMode.KUBERNETES,
        ),
        operator_args={
            "do_xcom_push": False,
            "project_dir": "/app",
            "image": "139260835254.dkr.ecr.us-east-2.amazonaws.com/dbtcicdnew:1.0",
            "get_logs": True,
            "is_delete_operator_pod": True,
            "namespace": "mwaa",
            "config_file": "/usr/local/airflow/dags/kubeconfig",
            "in_cluster": False,
            "image_pull_policy": "Always",
        },
    )

    e1 = EmptyOperator(task_id="pre_dbt")
    e2 = EmptyOperator(task_id="post_dbt")

    e1 >> run_models >> e2
The Graph view shows all tasks succeeded.
The immediate completion of Kubernetes pods without executing the tasks as expected in your Airflow DAG could be due to several factors. Here's a structured approach to troubleshoot and potentially resolve the issue:
Check Environment Variables: Ensure all necessary environment variables for DBT and Airflow are correctly passed to the Kubernetes pods. Misconfiguration here could lead to tasks exiting prematurely.
Verify Command Execution: Review the command constructed for execution inside the pod. Incorrect DBT command flags or project directory paths can cause immediate task completion. Ensure the command and its arguments are correctly specified.
Review Pod Configuration: Examine how the pod is configured, including environment variables, namespace, and Docker image settings. Misconfigurations in these areas can prevent tasks from executing as expected.
Assess Kubernetes Cluster Resources: Insufficient resources in your EKS cluster could lead to pods being evicted or failing to start. Check Kubernetes events and logs for resource-related issues.
Airflow Configuration: Confirm your Airflow setup is correctly configured for Kubernetes execution, including the executor and Kubernetes connection settings.
Pod Lifecycle Events: Investigate how pod lifecycle events are handled, especially the logic around pod cleanup and termination. Custom logic here might inadvertently cause premature pod termination.
To further diagnose, delve into the Airflow task logs and Kubernetes pod logs and events for error messages or warnings that provide more insight. This targeted approach should help identify and resolve the root cause of the issue.
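For the "verify command execution" step, it can help to reconstruct locally the kind of command the pod is expected to run and compare it against what `kubectl describe` reports for the container. A minimal sketch; the flags, paths, and selector below are illustrative assumptions, not the exact command Cosmos renders:

```python
def build_dbt_command(subcommand, project_dir, profiles_dir="/root/.dbt", select=None):
    """Illustrative only: roughly the shape of command a dbt run pod should execute."""
    cmd = ["dbt", subcommand, "--project-dir", project_dir, "--profiles-dir", profiles_dir]
    if select:
        cmd += ["--select", select]
    return cmd


print(build_dbt_command("run", "/app", select="base_public_users"))
```

If `kubectl describe pod` shows no command or arguments on the container (and `Environment: <none>`), the container may simply be running the image's default entrypoint and exiting immediately, which would match the ~0s Pending-to-Completed transitions.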
To continue the conversation, mention @dosu.
@ysjjovo what do those K8s pods logs say?
No logs are generated in the K8s pod (see screenshot).
The output of kubectl describe po -n mwaa dbt-task-group-base-public-users-run-rm109el1 is below. It seems no command was generated by the DAG.
➜ ~ kubectl describe po -n mwaa dbt-task-group-base-public-users-run-rm109el1
Name: dbt-task-group-base-public-users-run-rm109el1
Namespace: mwaa
Priority: 0
Node: ip-10-1-70-141.us-east-2.compute.internal/10.1.70.141
Start Time: Tue, 14 May 2024 10:37:39 +0800
Labels: airflow_kpo_in_cluster=False
airflow_version=2.8.1
already_checked=True
dag_id=dbtcicdnew
kubernetes_pod_operator=True
run_id=scheduled__2024-05-13T0000000000-10b1faeb7
task_id=dbt_task_group.base_public_users_run
try_number=1
Annotations: <none>
Status: Succeeded
IP: 10.1.70.150
IPs:
IP: 10.1.70.150
Containers:
base:
Container ID: containerd://cad08b5ba230bf388475379e0ba70ec5d06025eda4a4b1f2fe27e52525f1fd77
Image: 139260835254.dkr.ecr.us-east-2.amazonaws.com/dbtcicdnew:1.0
Image ID: 139260835254.dkr.ecr.us-east-2.amazonaws.com/dbtcicdnew@sha256:cd4aae808de68148f8de33e384354ad6195cee98d531f2f094e7f7ce49c8684c
Port: <none>
Host Port: <none>
State: Terminated
Reason: Completed
Exit Code: 0
Started: Tue, 14 May 2024 10:37:53 +0800
Finished: Tue, 14 May 2024 10:37:53 +0800
Ready: False
Restart Count: 0
Environment: <none>
Mounts:
/var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-9dntv (ro)
Conditions:
Type Status
PodReadyToStartContainers False
Initialized True
Ready False
ContainersReady False
PodScheduled True
Volumes:
kube-api-access-9dntv:
Type: Projected (a volume that contains injected data from multiple sources)
TokenExpirationSeconds: 3607
ConfigMapName: kube-root-ca.crt
ConfigMapOptional: <nil>
DownwardAPI: true
QoS Class: BestEffort
Node-Selectors: <none>
Tolerations: node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
Events: <none>
The Airflow Graph shows all success. Here are the logs of one Airflow task, 'base_public_users_run':
ip-10-1-66-19.us-east-2.compute.internal
*** Reading remote log from Cloudwatch log_group: airflow-lin-test-Task log_stream: dag_id=dbtcicdnew/run_id=manual__2024-05-14T02_34_50.996510+00_00/task_id=dbt_task_group.base_public_users_run/attempt=1.log.
[2024-05-14, 02:36:11 UTC] {{taskinstance.py:1956}} INFO - Dependencies all met for dep_context=non-requeueable deps ti=<TaskInstance: dbtcicdnew.dbt_task_group.base_public_users_run manual__2024-05-14T02:34:50.996510+00:00 [queued]>
[2024-05-14, 02:36:11 UTC] {{taskinstance.py:1956}} INFO - Dependencies all met for dep_context=requeueable deps ti=<TaskInstance: dbtcicdnew.dbt_task_group.base_public_users_run manual__2024-05-14T02:34:50.996510+00:00 [queued]>
[2024-05-14, 02:36:11 UTC] {{taskinstance.py:2170}} INFO - Starting attempt 1 of 1
[2024-05-14, 02:36:12 UTC] {{taskinstance.py:2191}} INFO - Executing <Task(DbtRunKubernetesOperator): dbt_task_group.base_public_users_run> on 2024-05-14 02:34:50.996510+00:00
[2024-05-14, 02:36:12 UTC] {{standard_task_runner.py:60}} INFO - Started process 371 to run task
[2024-05-14, 02:36:12 UTC] {{standard_task_runner.py:87}} INFO - Running: ['airflow', 'tasks', 'run', 'dbtcicdnew', 'dbt_task_group.base_public_users_run', 'manual__2024-05-14T02:34:50.996510+00:00', '--job-id', '10', '--raw', '--subdir', 'DAGS_FOLDER/dbt/dbtcicdnew/dags/eks.py', '--cfg-path', '/tmp/tmp37f9xj4k']
[2024-05-14, 02:36:12 UTC] {{standard_task_runner.py:88}} INFO - Job 10: Subtask dbt_task_group.base_public_users_run
[2024-05-14, 02:36:12 UTC] {{task_command.py:423}} INFO - Running <TaskInstance: dbtcicdnew.dbt_task_group.base_public_users_run manual__2024-05-14T02:34:50.996510+00:00 [running]> on host ip-10-1-66-19.us-east-2.compute.internal
[2024-05-14, 02:36:13 UTC] {{taskinstance.py:2480}} INFO - Exporting env vars: AIRFLOW_CTX_DAG_OWNER='airflow' AIRFLOW_CTX_DAG_ID='dbtcicdnew' AIRFLOW_CTX_TASK_ID='dbt_task_group.base_public_users_run' AIRFLOW_CTX_EXECUTION_DATE='2024-05-14T02:34:50.996510+00:00' AIRFLOW_CTX_TRY_NUMBER='1' AIRFLOW_CTX_DAG_RUN_ID='manual__2024-05-14T02:34:50.996510+00:00'
[2024-05-14, 02:36:13 UTC] {{pod.py:926}} INFO - Building pod dbt-task-group-base-public-users-run-hz8l2jbg with labels: {'dag_id': 'dbtcicdnew', 'task_id': 'dbt_task_group.base_public_users_run', 'run_id': 'manual__2024-05-14T023450.9965100000-c066a9c38', 'kubernetes_pod_operator': 'True', 'try_number': '1'}
[2024-05-14, 02:36:13 UTC] {{base.py:83}} INFO - Using connection ID 'kubernetes_default' for task execution.
[2024-05-14, 02:36:20 UTC] {{pod.py:490}} INFO - Found matching pod dbt-task-group-base-public-users-run-hz8l2jbg with labels {'airflow_kpo_in_cluster': 'False', 'airflow_version': '2.8.1', 'dag_id': 'dbtcicdnew', 'kubernetes_pod_operator': 'True', 'run_id': 'manual__2024-05-14T023450.9965100000-c066a9c38', 'task_id': 'dbt_task_group.base_public_users_run', 'try_number': '1'}
[2024-05-14, 02:36:20 UTC] {{pod.py:491}} INFO - `try_number` of task_instance: 1
[2024-05-14, 02:36:20 UTC] {{pod.py:492}} INFO - `try_number` of pod: 1
[2024-05-14, 02:36:20 UTC] {{pod_manager.py:359}} WARNING - Pod not yet started: dbt-task-group-base-public-users-run-hz8l2jbg
[2024-05-14, 02:36:21 UTC] {{pod_manager.py:359}} WARNING - Pod not yet started: dbt-task-group-base-public-users-run-hz8l2jbg
[2024-05-14, 02:36:22 UTC] {{pod_manager.py:359}} WARNING - Pod not yet started: dbt-task-group-base-public-users-run-hz8l2jbg
[2024-05-14, 02:36:23 UTC] {{pod_manager.py:460}} INFO - [base] None
[2024-05-14, 02:36:23 UTC] {{pod.py:788}} INFO - Skipping deleting pod: dbt-task-group-base-public-users-run-hz8l2jbg
[2024-05-14, 02:36:23 UTC] {{taskinstance.py:1138}} INFO - Marking task as SUCCESS. dag_id=dbtcicdnew, task_id=dbt_task_group.base_public_users_run, execution_date=20240514T023450, start_date=20240514T023611, end_date=20240514T023623
[2024-05-14, 02:36:24 UTC] {{local_task_job_runner.py:234}} INFO - Task exited with return code 0
[2024-05-14, 02:36:24 UTC] {{taskinstance.py:3280}} INFO - 0 downstream tasks scheduled from follow-on schedule check
@ysjjovo, thanks for the details! Could you confirm which version of apache-airflow-providers-cncf-kubernetes you are running, and share the Dockerfile used to build the image 139260835254.dkr.ecr.us-east-2.amazonaws.com/dbtcicdnew:1.0?
Also, could you test if your DAG works as expected in a local k8s cluster (e.g., with Kind, as described in these docs)? That would help us isolate the issue from a bug specific to MWAA, a problem in the Docker image, or a problem in Cosmos.
I didn't specify the apache-airflow-providers-cncf-kubernetes version.
My requirements.txt is
astronomer-cosmos==1.3.2
apache-airflow-providers-cncf-kubernetes
But I can get the version from the MWAA (Airflow) logs: the version is 7.13.0.
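If you need to confirm the installed provider version without digging through logs, a small standard-library check works from any Python shell in the MWAA environment. A sketch; the default package name is the one under discussion here:

```python
from importlib.metadata import PackageNotFoundError, version


def provider_version(pkg="apache-airflow-providers-cncf-kubernetes"):
    """Return the installed version of `pkg`, or None if it is not installed."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None


print(provider_version())
```

Running this in the environment that schedules the tasks (not inside the dbt image) reports the provider version Airflow actually uses.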
Here is the Dockerfile I used.
FROM python:3.9-slim
ENV AWS_DEFAULT_REGION "us-east-2"
RUN apt-get update
WORKDIR /app
COPY ./requirements.txt /app
RUN pip install -U pip \
&& pip --no-cache-dir install -r ./requirements.txt
COPY . .
COPY profiles.yml /root/.dbt/profiles.yml
RUN dbt deps
Here is the entire dbt project: https://github.com/ysjjovo/dbtcicdnew. I also tried local Airflow with a public EKS (k8s) cluster reachable locally, and got the same problem. Here is the issue I submitted: https://github.com/astronomer/cosmos-example/issues/7
Environments: