Open CYarros10 opened 1 year ago
Thanks for opening your first issue here! Be sure to follow the issue template! If you are willing to raise PR to address this issue please do so, no need to wait for approval.
Agreed. dataflow_job_id
should be pushed to XCom as early as it is known, not only in BeamRunPythonPipelineOperator but also in the Java and Go versions of the operator.
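For context, one way the operators could learn the job id early is the way the Dataflow hook does: by scanning the runner's streamed log output. A minimal sketch, assuming log line formats like `Submitted job: <id>` (Java) and `Created job with id: [<id>]` (Python) — the exact formats and the function name here are assumptions, not the provider's actual API:

```python
import re
from typing import Optional

# Assumed runner log formats (hypothetical for this sketch):
#   Java runner:   "Submitted job: 2023-01-01_00_00_00-1234567890"
#   Python runner: "Created job with id: [2023-01-01_00_00_00-1234567890]"
JOB_ID_PATTERN = re.compile(
    r"Submitted job: (?P<job_id_java>\S+)"
    r"|Created job with id: \[(?P<job_id_python>[^\]]+)\]"
)

def extract_dataflow_job_id(line: str) -> Optional[str]:
    """Return the Dataflow job id embedded in a runner log line, if any."""
    match = JOB_ID_PATTERN.search(line)
    if match:
        return match.group("job_id_java") or match.group("job_id_python")
    return None
```

An operator could run such a callback per log line and push the first non-None result to XCom immediately, instead of waiting for the pipeline to finish.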
Can I take care of this issue?
@hubert-pietron Sure thing, all yours!
I need to unassign myself; due to a change of work I currently don't have time to look into the problem :/
job_id is stored in: job_id="{{task_instance.xcom_pull('start_python_job_async')['dataflow_job_id']}}",
It is not stored in: job_id="{{task_instance.xcom_pull('start_python_job_async')['dataflow_job_config']['job_id']}}",
Code Reference:
If you modify your code to pull the Dataflow job_id from the correct XCom key, you will be able to retrieve it.
To illustrate, here is sample code showing how to retrieve the Dataflow job id:
Have you tested this? The documentation is inconsistent and not reliable enough to go off of on its own. For example, the documentation you referenced states:
wait_for_python_job_dataflow_runner_async_done = DataflowJobStatusSensor(
    task_id="wait-for-python-job-async-done",
    job_id="{{task_instance.xcom_pull('start_python_job_dataflow_runner_async')['dataflow_job_id']}}",
    expected_statuses={DataflowJobStatus.JOB_STATE_DONE},
    project_id=GCP_PROJECT_ID,
    location='us-central1',
)
and dataflow_job_id is not actually in the xcom
You are right. I was asked to take a look at this issue and didn't have a chance to read the issue description in detail. I was only checking successful runs, so I was able to get the Dataflow job ids.
The dataflow job id is indeed only available after a Dataflow job finishes successfully.
It is not available when a Dataflow job starts or while it is running.
In a perfect world where no issue occurs, this is fine, but in the real world, when a Dataflow job gets cancelled, there is no job id to track the cancelled Dataflow job.
I can take on this issue
@zeotuan All yours!
Apache Airflow version
2.5.1
What happened
BeamRunPythonPipelineOperator does not push values to XCom when the pipeline starts, but the Dataflow sensors depend on that XCom value to find the job.
Since the only way to retrieve the Dataflow job ID from a BeamRunPythonPipelineOperator is through XCom, and BeamRunPythonPipelineOperator does not push this XCom until the pipeline ends, the sensor can't "sense". It can only read jobs that are already done.
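To make the failure mode concrete, here is a simplified, hypothetical model of a status sensor's poke loop (the stub hook and function names are illustrative, not the provider's actual API). Without a job id, there is nothing to poll:

```python
class StubDataflowHook:
    """Stand-in for the real Dataflow hook; returns a canned state per job id."""

    def __init__(self, states):
        self._states = states  # e.g. {"job-1": "JOB_STATE_DONE"}

    def get_job_state(self, job_id):
        return self._states[job_id]


def poke(hook, job_id, expected_statuses):
    """Return True once the job reaches an expected status.

    If job_id is None (the XCom was never pushed), the sensor cannot
    poll anything -- which is exactly the situation in this issue.
    """
    if job_id is None:
        raise ValueError("job_id is required before the sensor can poll")
    return hook.get_job_state(job_id) in expected_statuses
```

With the current behavior, the templated `job_id` resolves against an XCom that doesn't exist yet, so the sensor never gets this far for running jobs.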
Error Message:
jinja2.exceptions.UndefinedError: 'None' has no attribute 'dataflow_job_config'
BeamRunPythonPipelineOperator Xcom (after completing):
What you think should happen instead
The Dataflow job ID should be pushed to XCom when, or before, the pipeline starts.
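One way this could look, as a sketch with a stub XCom store (the stub class, `run_pipeline`, and the log-line format are hypothetical, not the provider's API): the operator pushes the id the moment the runner reports it, before blocking on pipeline completion.

```python
class StubTaskInstance:
    """Minimal stand-in for Airflow's TaskInstance XCom interface."""

    def __init__(self):
        self._xcom = {}

    def xcom_push(self, key, value):
        self._xcom[key] = value

    def xcom_pull(self, key):
        return self._xcom.get(key)


def run_pipeline(ti, log_lines, wait_until_finish):
    """Hypothetical operator core: push dataflow_job_id as soon as it is seen."""
    prefix = "Created job with id: ["
    for line in log_lines:
        if line.startswith(prefix):
            job_id = line[len(prefix):-1]  # strip the trailing "]"
            # Pushed as soon as the runner reports the id, not at the end:
            ti.xcom_push("dataflow_job_id", job_id)
            break
    wait_until_finish()  # a sensor can already see the id while this blocks
```

A sensor polling the same XCom store would see `dataflow_job_id` while the pipeline is still running, instead of only after it completes.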
How to reproduce
Sample Code
Operating System
composer-2.1.5-airflow-2.4.3
Versions of Apache Airflow Providers
2.4.3
Deployment
Google Cloud Composer
Deployment details
No response
Anything else
Occurs every time
Are you willing to submit PR?
Code of Conduct