Closed iasoon closed 1 year ago
I am still seeing this issue with Airflow 2.2.2 and the "tutorial" DAG sleep task (https://airflow.apache.org/docs/apache-airflow/stable/_modules/airflow/example_dags/tutorial.html). Here is the log:
[2021-11-17 14:21:40,353] {taskinstance.py:1035} INFO - Dependencies all met for <TaskInstance: tutorial.sleep scheduled__2021-11-16T20:43:29.803380+00:00 [queued]>
[2021-11-17 14:21:40,361] {taskinstance.py:1035} INFO - Dependencies all met for <TaskInstance: tutorial.sleep scheduled__2021-11-16T20:43:29.803380+00:00 [queued]>
[2021-11-17 14:21:40,361] {taskinstance.py:1241} INFO -
--------------------------------------------------------------------------------
[2021-11-17 14:21:40,361] {taskinstance.py:1242} INFO - Starting attempt 2 of 4
[2021-11-17 14:21:40,361] {taskinstance.py:1243} INFO -
--------------------------------------------------------------------------------
[2021-11-17 14:21:40,370] {taskinstance.py:1262} INFO - Executing <Task(BashOperator): sleep> on 2021-11-16 20:43:29.803380+00:00
[2021-11-17 14:21:40,371] {base_task_runner.py:141} INFO - Running on host: ***
[2021-11-17 14:21:40,371] {base_task_runner.py:142} INFO - Running: ['airflow', 'tasks', 'run', 'tutorial', 'sleep', 'scheduled__2021-11-16T20:43:29.803380+00:00', '--job-id', '5650', '--raw', '--subdir', 'DAGS_FOLDER/tutorial.py', '--cfg-path', '/tmp/tmpqizvuf8t', '--error-file', '/tmp/tmpwyrb231g']
[2021-11-17 14:21:41,834] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep [2021-11-17 14:21:41,834] {dagbag.py:500} INFO - Filling up the DagBag from /home/myuser/airflow/dags/tutorial.py
[2021-11-17 14:21:45,432] {local_task_job.py:206} WARNING - Recorded pid 17329 does not match the current pid 17330
[2021-11-17 14:21:45,435] {process_utils.py:100} INFO - Sending Signals.SIGTERM to GPID 17330
[2021-11-17 14:21:45,460] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep Running <TaskInstance: tutorial.sleep scheduled__2021-11-16T20:43:29.803380+00:00 [running]> on host ***
[2021-11-17 14:21:45,462] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep Traceback (most recent call last):
[2021-11-17 14:21:45,462] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep File "/usr/local/bin/airflow", line 8, in <module>
[2021-11-17 14:21:45,462] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep sys.exit(main())
[2021-11-17 14:21:45,462] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep File "/usr/local/lib/python3.8/dist-packages/airflow/__main__.py", line 48, in main
[2021-11-17 14:21:45,462] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep args.func(args)
[2021-11-17 14:21:45,462] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep File "/usr/local/lib/python3.8/dist-packages/airflow/cli/cli_parser.py", line 48, in command
[2021-11-17 14:21:45,462] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep return func(*args, **kwargs)
[2021-11-17 14:21:45,462] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep File "/usr/local/lib/python3.8/dist-packages/airflow/utils/cli.py", line 92, in wrapper
[2021-11-17 14:21:45,462] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep return f(*args, **kwargs)
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep File "/usr/local/lib/python3.8/dist-packages/airflow/cli/commands/task_command.py", line 292, in task_run
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep _run_task_by_selected_method(args, dag, ti)
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep File "/usr/local/lib/python3.8/dist-packages/airflow/cli/commands/task_command.py", line 107, in _run_task_by_selected_method
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep _run_raw_task(args, ti)
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep File "/usr/local/lib/python3.8/dist-packages/airflow/cli/commands/task_command.py", line 180, in _run_raw_task
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep ti._run_raw_task(
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep File "/usr/local/lib/python3.8/dist-packages/airflow/utils/session.py", line 70, in wrapper
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep return func(*args, session=session, **kwargs)
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep File "/usr/local/lib/python3.8/dist-packages/airflow/models/taskinstance.py", line 1332, in _run_raw_task
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep self._execute_task_with_callbacks(context)
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep File "/usr/local/lib/python3.8/dist-packages/airflow/models/taskinstance.py", line 1458, in _execute_task_with_callbacks
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep result = self._execute_task(context, self.task)
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep File "/usr/local/lib/python3.8/dist-packages/airflow/models/taskinstance.py", line 1514, in _execute_task
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep result = execute_callable(context=context)
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep File "/usr/local/lib/python3.8/dist-packages/airflow/operators/bash.py", line 178, in execute
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep result = self.subprocess_hook.run_command(
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep File "/usr/local/lib/python3.8/dist-packages/airflow/hooks/subprocess.py", line 87, in run_command
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep for raw_line in iter(self.sub_process.stdout.readline, b''):
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep File "/usr/local/lib/python3.8/dist-packages/airflow/models/taskinstance.py", line 1413, in signal_handler
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep raise AirflowException("Task received SIGTERM signal")
[2021-11-17 14:21:45,463] {base_task_runner.py:122} INFO - Job 5650: Subtask sleep airflow.exceptions.AirflowException: Task received SIGTERM signal
[2021-11-17 14:21:45,648] {process_utils.py:66} INFO - Process psutil.Process(pid=17334, status='terminated', started='14:21:41') (17334) terminated with exit code None
[2021-11-17 14:21:45,741] {process_utils.py:66} INFO - Process psutil.Process(pid=17330, status='terminated', exitcode=1, started='14:21:40') (17330) terminated with exit code 1
airflow info
Apache Airflow version | 2.2.2
executor               | LocalExecutor
task_logging_handler   | airflow.utils.log.file_task_handler.FileTaskHandler
sql_alchemy_conn       | mysql://myuser@localhost/airflow
dags_folder            | /home/myuser/airflow/dags
plugins_folder         | /home/myuser/airflow/plugins
base_log_folder        | /home/myuser/airflow/logs
remote_base_log_folder |

System info
OS              | Linux
architecture    | x86_64
uname           | uname_result(system='Linux', node='*****', release='4.15.0-159-generic', version='#167~16.04.1-Ubuntu SMP Wed Sep 22 14:59:34 UTC 2021', machine='x86_64', processor='x86_64')
locale          | ('en_US', 'UTF-8')
python_version  | 3.8.9 (default, Apr 3 2021, 01:02:10) [GCC 5.4.0 20160609]
python_location | /usr/bin/python3.8

Tools info
git             | git version 2.7.4
ssh             | OpenSSH_7.2p2 Ubuntu-4ubuntu2.10, OpenSSL 1.0.2g 1 Mar 2016
kubectl         | NOT AVAILABLE
gcloud          | NOT AVAILABLE
cloud_sql_proxy | NOT AVAILABLE
mysql           | mysql Ver 14.14 Distrib 5.7.33, for Linux (x86_64) using EditLine wrapper
sqlite3         | NOT AVAILABLE
psql            | NOT AVAILABLE

Paths info
airflow_home    | /home/myuser/airflow
system_path     | /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:
python_path     | /usr/local/bin:/usr/lib/python38.zip:/usr/lib/python3.8:/usr/lib/python3.8/lib-dynload:/usr/local/lib/python3.8/dist-packages:/usr/lib/python3/dist-packages:/home/myuser/airflow/dags:/home/myuser/airflow/config:/home/myuser/airflow/plugins
airflow_on_path | True

Providers info
apache-airflow-providers-ftp    | 2.0.1
apache-airflow-providers-http   | 2.0.1
apache-airflow-providers-imap   | 2.0.1
apache-airflow-providers-mysql  | 2.1.1
apache-airflow-providers-sqlite | 2.0.1
I found that I had changed airflow.cfg to set default_impersonation = myuser. When I set it back to the default (default_impersonation left empty), this issue was resolved for me in Airflow 2.2.2. I had made that change after finding "Logs suppressed when impersonation is enabled" https://github.com/apache/airflow/issues/18041
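For reference, the setting in question lives in the [core] section of airflow.cfg; leaving it empty (the default) means tasks run as the worker's own user unless a task sets run_as_user itself. A minimal fragment:

```
[core]
# User to impersonate for tasks that do not set run_as_user.
# Empty (default) = no impersonation.
default_impersonation =
```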
Had a very similar problem; forcing run_as_user=None when running my DAG fixed it. You may have a config setting a value for this attribute, resulting in the stated behavior.
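To illustrate the workaround described above (a sketch only, mirroring what the commenter reported rather than a verified fix): run_as_user is a BaseOperator argument, so it can be passed explicitly on the task. Note that per the other comments in this thread, a default_impersonation set in airflow.cfg may still apply when the task-level value is unset, so clearing the config may also be needed.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="no_impersonation_example",  # hypothetical DAG id
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
) as dag:
    # Explicitly pass run_as_user=None so the task runs as the worker's
    # own user instead of an impersonated one.
    sleep = BashOperator(
        task_id="sleep",
        bash_command="sleep 5",
        run_as_user=None,
    )
```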
I am also seeing this same issue on 2.2.2. I have to use run_as_user in order to access user environment configurations for some DAGs. These DAGs can never complete successfully; they always fail with the message "Recorded pid <> does not match the current pid <>".
Most "solutions" I'm finding recommend not using run_as_user and also clearing default_impersonation. Unfortunately this isn't a solution, as we actually do need the functionality from those.
In order to use run_as_user, you must follow this guide: http://airflow.apache.org/docs/apache-airflow/stable/security/workload.html#impersonation
Hi all,
I am experiencing this on 2.3.2 with LocalExecutor (4 schedulers), Postgres, and Ubuntu 22.04.
This is, however, running a clone of our staging environment of dags that run fine on 2.1.4 and Ubuntu 16.04. I'm also running on a much smaller and less powerful instance, and so it may be exacerbating race conditions.
I did some investigation into the process state, and when this error leads to a failure, this is what I see in the process executions (under a long-lived airflow scheduler process):

- recorded_pid, which is normally assigned the task instance pid (ti.pid), and the parent of the task instance pid (psutil.Process(ti.pid).ppid()) when RUN_AS_USER is set. When failing, this consistently shows up as the worker (worker -- LocalExecutor). This is a persistent, long-term process.
- current_pid (os.getpid()), which is the airflow task supervisor. This (and everything below) is one of the short-term, task-specific processes.
- The remaining process can be different things, but always appears to be the child of the task supervisor / current pid. Oftentimes this must be a fleeting process, as I can barely catch a record of it when I'm trying to fetch a snapshot. A couple that I have seen: airflow tasks run [taskname] and airflow task su (and the tasks are RUN_AS_USER, so likely related).

I came to wonder: since this error happens because (a) the final recorded_pid is not None and (b) recorded_pid != current_pid, it doesn't make much sense to ever be comparing against the task instance pid, since that's hanging around for a very long time, and the heartbeat function appears to be identifying when the current task runner is zombified or missing.

As I've investigated further, I've found that on task failures for RUN_AS_USER tasks in which this fails, the ti.pid is almost invariably None, which means the recorded_pid comes in as psutil.Process(None).ppid(), which will be the parent of the current process. I am currently under the impression that this was not intended, and that the error condition should only be tested when ti.pid is not None, instead of recorded_pid is not None.
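The logic described above can be sketched in isolation (illustrative names only; the real check lives in local_task_job.py's heartbeat handling, and parent_of stands in for psutil.Process(pid).ppid()):

```python
def should_terminate(ti_pid, current_pid, run_as_user=None, parent_of=None):
    """Current-style guard: fires whenever recorded_pid exists and differs.

    When run_as_user is set and ti_pid is None, the real code's
    psutil.Process(None) silently means *the current process*, so
    recorded_pid becomes our own parent -- never equal to current_pid.
    """
    if run_as_user:
        recorded_pid = parent_of(ti_pid)
    else:
        recorded_pid = ti_pid
    return recorded_pid is not None and recorded_pid != current_pid


def should_terminate_fixed(ti_pid, current_pid, run_as_user=None, parent_of=None):
    """Proposed guard: only compare when a task pid was actually recorded."""
    if ti_pid is None:
        return False
    recorded_pid = parent_of(ti_pid) if run_as_user else ti_pid
    return recorded_pid != current_pid


# Simulate the failure: ti.pid was never recorded (None), yet the current
# guard still terminates the task because the looked-up parent pid differs.
fake_ppid = lambda pid: 999  # stand-in for an unrelated parent process
print(should_terminate(None, 1234, run_as_user="etl", parent_of=fake_ppid))        # spurious kill
print(should_terminate_fixed(None, 1234, run_as_user="etl", parent_of=fake_ppid))  # no kill
```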
I'm testing this right now and it seems to work - and if that seems to hold up I'll put in a PR.
This does seem to be working consistently for LocalExecutor - I haven't checked Celery or Kubernetes.
It will take me a little while to set up the dev environment and do the testing before submitting a PR, but feel free to give it a whirl, I have a tentative branch set up here: https://github.com/krcrouse/airflow/tree/fix-pid-check
I see the same issue happening on 2.4.3. Multiple DAGs started getting terminated within a range of 5 minutes, after running fine for weeks, due to this error. Here are my config details:

Python 3.10 (set up through miniconda)
Airflow 2.4.3
Metadata DB: MySQL 8
Executor: LocalExecutor
run_as_user is not set; the config is empty.

OS details:
NAME="Ubuntu"
VERSION="18.04.6 LTS (Bionic Beaver)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 18.04.6 LTS"
VERSION_ID="18.04"
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
VERSION_CODENAME=bionic
UBUNTU_CODENAME=bionic
Seen this in the task logs:
[2023-05-04, 02:28:21 EDT] {local_task_job.py:205} WARNING - Recorded pid 3738156 does not match the current pid 3634136
[2023-05-04, 02:28:21 EDT] {process_utils.py:129} INFO - Sending Signals.SIGTERM to group 3634136. PIDs of all processes in the group: [3634136]
[2023-05-04, 02:28:21 EDT] {process_utils.py:84} INFO - Sending the signal Signals.SIGTERM to group 3634136
[2023-05-04, 02:28:21 EDT] {taskinstance.py:1562} ERROR - Received SIGTERM. Terminating subprocesses.
Airflow Logs
[2023-05-04 02:28:08,454] {manager.py:288} WARNING - DagFileProcessorManager (PID=3732979) exited with exit code 1 - re-launching
[2023-05-04 02:28:08,469] {manager.py:163} INFO - Launched DagFileProcessorManager with pid: 3737637
[2023-05-04 02:28:08,480] {settings.py:58} INFO - Configured default timezone Timezone('UTC')
[2023-05-04 02:28:08,541] {scheduler_job.py:1380} INFO - Resetting orphaned tasks for active dag runs
[2023-05-04 02:28:08,547] {scheduler_job.py:1403} INFO - Marked 1 SchedulerJob instances as failed
[2023-05-04 02:28:08,767] {scheduler_job.py:1444} INFO - Reset the following 38 orphaned TaskInstances:
<TaskInstance: *** scheduled__2023-05-04T01:05:00+00:00 [queued]>
<TaskInstance: *** scheduled__2023-04-30T20:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-02T20:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T02:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-04-16T00:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-04-29T08:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-04-30T08:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-04T05:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-04T00:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-04-23T00:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-04T04:05:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-04T04:30:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T20:05:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T01:03:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-04T00:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T19:36:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-04T05:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-04T06:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-04-30T19:30:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-04T05:05:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-04T06:20:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-04T00:05:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-04T06:20:00+00:00 [running]>
<TaskInstance: *** scheduled__2023-05-03T05:00:00+00:00 [running]>
[2023-05-04 02:28:13,780] {scheduler_job.py:346} INFO - 23 tasks up for execution:
<TaskInstance: *** scheduled__2023-05-04T00:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-04-30T08:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-04-30T20:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T01:03:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T02:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T20:05:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T04:05:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T05:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T06:20:00+00:00 [scheduled]>
[2023-05-04 02:28:13,780] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:13,780] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 1/50 running and queued tasks
[2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 2/50 running and queued tasks
[2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 3/50 running and queued tasks
[2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 4/50 running and queued tasks
[2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:13,781] {scheduler_job.py:411} INFO - DAG *** has 1/50 running and queued tasks
[2023-05-04 02:28:13,782] {scheduler_job.py:411} INFO - DAG *** has 2/50 running and queued tasks
[2023-05-04 02:28:13,782] {scheduler_job.py:411} INFO - DAG *** has 3/50 running and queued tasks
[2023-05-04 02:28:13,782] {scheduler_job.py:411} INFO - DAG *** has 4/50 running and queued tasks
[2023-05-04 02:28:13,782] {scheduler_job.py:411} INFO - DAG *** has 5/50 running and queued tasks
[2023-05-04 02:28:13,782] {scheduler_job.py:411} INFO - DAG *** has 6/50 running and queued tasks
[2023-05-04 02:28:13,782] {scheduler_job.py:411} INFO - DAG *** has 5/50 running and queued tasks
[2023-05-04 02:28:13,782] {scheduler_job.py:411} INFO - DAG *** has 6/50 running and queued tasks
[2023-05-04 02:28:13,783] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:13,783] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:13,783] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:13,783] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:13,783] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:13,783] {scheduler_job.py:497} INFO - Setting the following tasks to queued state:
<TaskInstance: *** scheduled__2023-05-04T00:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-04-30T08:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-04-30T20:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T01:03:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T02:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T20:05:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T04:05:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T05:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T06:20:00+00:00 [scheduled]>
[2023-05-04 02:28:13,960] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, try_number=2, map_index=-1) to executor with priority 7 and queue default
[2023-05-04 02:28:13,961] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,961] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-04-30T08:00:00+00:00', try_number=2, map_index=-1) to executor with priority 4 and queue default
[2023-05-04 02:28:13,961] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-30T08:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,962] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-04-30T20:00:00+00:00', try_number=5, map_index=-1) to executor with priority 3 and queue default
[2023-05-04 02:28:13,962] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-30T20:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,962] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T01:03:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
[2023-05-04 02:28:13,962] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T01:03:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,963] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
[2023-05-04 02:28:13,963] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,963] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
[2023-05-04 02:28:13,963] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,964] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
[2023-05-04 02:28:13,964] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,964] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
[2023-05-04 02:28:13,964] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,964] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
[2023-05-04 02:28:13,965] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,965] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
[2023-05-04 02:28:13,965] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,965] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
[2023-05-04 02:28:13,966] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,966] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
[2023-05-04 02:28:13,966] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,966] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
[2023-05-04 02:28:13,967] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,967] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
[2023-05-04 02:28:13,967] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,967] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
[2023-05-04 02:28:13,967] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,968] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
[2023-05-04 02:28:13,968] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,968] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
[2023-05-04 02:28:13,968] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,969] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T06:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
[2023-05-04 02:28:13,969] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,969] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T02:00:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
[2023-05-04 02:28:13,969] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T02:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,970] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T20:05:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
[2023-05-04 02:28:13,970] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T20:05:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,970] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T04:05:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
[2023-05-04 02:28:13,970] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T04:05:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,970] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T05:00:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
[2023-05-04 02:28:13,971] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T05:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,971] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T06:20:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
[2023-05-04 02:28:13,971] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T06:20:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,999] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,999] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-30T08:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:13,999] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-30T20:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,000] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T01:03:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,000] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,000] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,000] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,001] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,001] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,002] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,002] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,002] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,002] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,003] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,003] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,004] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,004] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,004] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,004] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T02:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,005] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T20:05:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,005] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T04:05:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,005] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T05:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,010] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T06:20:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:14,099] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,108] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,110] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,127] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,134] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,137] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,152] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,194] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,237] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,247] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,248] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,250] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,251] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,253] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,255] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,257] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,257] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,275] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,293] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,304] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,318] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,324] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,332] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:14,517] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,591] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,595] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,597] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,682] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,733] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,757] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,771] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,782] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,790] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,822] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,840] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,844] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,845] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,847] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,882] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,893] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,902] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,918] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,935] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,937] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,939] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,952] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,973] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:14,979] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:15,015] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:15,024] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:15,040] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:15,044] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:15,054] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:15,075] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:15,079] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:15,090] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:15,095] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:15,102] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:15,136] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,139] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:15,183] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,190] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:15,225] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:15,228] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T02:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,240] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:15,248] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,259] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:15,282] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,315] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,353] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,384] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,412] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,444] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T04:05:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,445] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,528] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T05:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,568] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,583] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,669] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-04-30T20:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,669] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T06:20:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,707] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,710] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,779] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T01:03:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,779] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T06:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,796] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T20:05:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,861] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T00:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:15,952] {credentials.py:1255} INFO - Found credentials in shared credentials file: ~/.aws/credentials
[2023-05-04 02:28:15,990] {scheduler_job.py:346} INFO - 9 tasks up for execution:
<TaskInstance: *** scheduled__2023-04-29T08:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-02T20:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T05:05:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T05:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T00:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-04-30T19:30:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T19:36:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T00:05:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T05:00:00+00:00 [scheduled]>
[2023-05-04 02:28:15,995] {scheduler_job.py:411} INFO - DAG *** has 1/50 running and queued tasks
[2023-05-04 02:28:15,996] {scheduler_job.py:411} INFO - DAG *** has 1/50 running and queued tasks
[2023-05-04 02:28:15,996] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:15,996] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:15,996] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:15,996] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:15,996] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:15,996] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:15,996] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:15,997] {scheduler_job.py:497} INFO - Setting the following tasks to queued state:
<TaskInstance: *** scheduled__2023-04-29T08:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-02T20:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T05:05:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T05:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T00:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-04-30T19:30:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-03T19:36:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T00:05:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T05:00:00+00:00 [scheduled]>
[2023-05-04 02:28:16,014] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-04-29T08:00:00+00:00', try_number=2, map_index=-1) to executor with priority 4 and queue default
[2023-05-04 02:28:16,015] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-29T08:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,015] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-02T20:00:00+00:00', try_number=2, map_index=-1) to executor with priority 3 and queue default
[2023-05-04 02:28:16,016] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-02T20:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,016] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T05:05:00+00:00', try_number=2, map_index=-1) to executor with priority 3 and queue default
[2023-05-04 02:28:16,016] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T05:05:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,016] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T05:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
[2023-05-04 02:28:16,016] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T05:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,016] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T00:00:00+00:00', try_number=2, map_index=-1) to executor with priority 2 and queue default
[2023-05-04 02:28:16,016] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,017] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-04-30T19:30:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
[2023-05-04 02:28:16,017] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-30T19:30:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,017] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-03T19:36:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
[2023-05-04 02:28:16,017] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T19:36:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,017] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T00:05:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
[2023-05-04 02:28:16,017] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T00:05:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,018] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T05:00:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
[2023-05-04 02:28:16,018] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T05:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,051] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-29T08:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,051] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-02T20:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,051] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T05:05:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,051] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T05:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,052] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,052] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-30T19:30:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,052] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-03T19:36:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,052] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T00:05:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,053] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T05:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:16,256] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:16,340] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:16,348] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:16,355] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:16,399] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:16,410] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:16,434] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:16,434] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:16,446] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:16,589] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:16,703] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:16,828] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:16,941] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:17,047] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:17,246] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:17,257] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:17,284] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:17,306] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:17,320] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:17,423] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:17,450] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T00:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:17,455] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:17,477] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:17,682] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-02T20:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:17,894] {credentials.py:1255} INFO - Found credentials in shared credentials file: ~/.aws/credentials
[2023-05-04 02:28:17,913] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T05:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:18,055] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T19:36:00+00:00 [queued]> on host ***
[2023-05-04 02:28:18,196] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T00:05:00+00:00 [queued]> on host ***
[2023-05-04 02:28:18,288] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-04-30T19:30:00+00:00 [queued]> on host ***
[2023-05-04 02:28:18,317] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-03T05:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:18,734] {scheduler_job.py:346} INFO - 6 tasks up for execution:
<TaskInstance: *** scheduled__2023-05-04T01:05:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-04-16T00:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-04-23T00:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T04:30:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T06:20:00+00:00 [scheduled]>
[2023-05-04 02:28:18,735] {scheduler_job.py:411} INFO - DAG *** has 1/50 running and queued tasks
[2023-05-04 02:28:18,735] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:18,735] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:18,735] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:18,735] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:18,735] {scheduler_job.py:411} INFO - DAG *** has 0/50 running and queued tasks
[2023-05-04 02:28:18,735] {scheduler_job.py:497} INFO - Setting the following tasks to queued state:
<TaskInstance: *** scheduled__2023-05-04T01:05:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-04-16T00:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-04-23T00:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T04:30:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T06:00:00+00:00 [scheduled]>
<TaskInstance: *** scheduled__2023-05-04T06:20:00+00:00 [scheduled]>
[2023-05-04 02:28:18,812] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T01:05:00+00:00', try_number=1, map_index=-1) to executor with priority 4 and queue default
[2023-05-04 02:28:18,812] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T01:05:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:18,813] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-04-16T00:00:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
[2023-05-04 02:28:18,813] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-16T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:18,813] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-04-23T00:00:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
[2023-05-04 02:28:18,813] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-23T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:18,813] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T04:30:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
[2023-05-04 02:28:18,813] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T04:30:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:18,813] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T06:00:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
[2023-05-04 02:28:18,814] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:18,814] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-04T06:20:00+00:00', try_number=2, map_index=-1) to executor with priority 1 and queue default
[2023-05-04 02:28:18,814] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T06:20:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:18,861] {credentials.py:1255} INFO - Found credentials in shared credentials file: ~/.aws/credentials
[2023-05-04 02:28:18,871] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T01:05:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:18,872] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-16T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:18,872] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-04-23T00:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:18,872] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T04:30:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:18,873] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T06:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:18,873] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-04T06:20:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:18,983] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:19,088] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:19,110] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:19,132] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:19,133] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:19,135] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:19,160] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:19,171] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[... 22 more identical "Using connection ID ssh_*** for task execution." INFO lines (02:28:19,212 – 02:28:19,775) omitted ...]
[2023-05-04 02:28:19,786] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-04-23T00:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:19,794] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:19,803] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:19,822] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:19,830] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:19,874] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-04-16T00:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:19,889] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:19,963] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:19,998] {credentials.py:1255} INFO - Found credentials in shared credentials file: ~/.aws/credentials
[2023-05-04 02:28:20,029] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:20,079] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T04:30:00+00:00 [queued]> on host ***
[2023-05-04 02:28:20,085] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:20,128] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:20,179] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T06:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:20,204] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:20,241] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T06:20:00+00:00 [queued]> on host ***
[2023-05-04 02:28:20,240] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[... 28 more identical "Using connection ID ssh_*** for task execution." INFO lines (02:28:20,298 – 02:28:21,091) omitted ...]
[2023-05-04 02:28:21,107] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T05:05:00+00:00 [queued]> on host ***
[2023-05-04 02:28:21,150] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[... 11 more identical "Using connection ID ssh_*** for task execution." INFO lines (02:28:21,287 – 02:28:22,881) omitted ...]
[2023-05-04 02:28:23,260] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-04T01:05:00+00:00 [queued]> on host ***
[2023-05-04 02:28:23,499] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
Traceback (most recent call last):
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/executors/local_executor.py", line 126, in _execute_work_in_fork
args.func(args)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/cli_parser.py", line 52, in command
return func(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/cli.py", line 103, in wrapper
return f(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 382, in task_run
_run_task_by_selected_method(args, dag, ti)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 189, in _run_task_by_selected_method
_run_task_by_local_task_job(args, ti)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 247, in _run_task_by_local_task_job
run_job.run()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", line 247, in run
self._execute()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py", line 135, in _execute
self.heartbeat()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", line 228, in heartbeat
self.heartbeat_callback(session=session)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/session.py", line 72, in wrapper
return func(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py", line 208, in heartbeat_callback
raise AirflowException("PID of job runner does not match")
airflow.exceptions.AirflowException: PID of job runner does not match
[2023-05-04 02:28:23,629] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-04T00:05:00+00:00 exited with status failed for try_number 1
[2023-05-04 02:28:23,627] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
(traceback identical to the first "PID of job runner does not match" traceback above)
[2023-05-04 02:28:23,640] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-04T00:05:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:18.756761+00:00, run_end_date=2023-05-04 06:28:22.505894+00:00, run_duration=3.74913, state=up_for_retry, executor_state=failed, try_number=1, max_tries=3, job_id=277262, pool=default_pool, queue=default, priority_weight=1, operator=SSHOperator, queued_dttm=2023-05-04 06:28:15.998301+00:00, queued_by_job_id=233035, pid=3738404
[2023-05-04 02:28:23,664] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
(traceback identical to the first "PID of job runner does not match" traceback above)
[2023-05-04 02:28:23,675] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
(traceback identical to the first "PID of job runner does not match" traceback above)
[2023-05-04 02:28:23,696] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
(traceback identical to the first "PID of job runner does not match" traceback above)
[2023-05-04 02:28:23,737] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
(traceback identical to the first "PID of job runner does not match" traceback above)
[2023-05-04 02:28:23,791] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
(traceback identical to the first "PID of job runner does not match" traceback above)
[2023-05-04 02:28:23,801] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
(traceback identical to the first "PID of job runner does not match" traceback above)
[2023-05-04 02:28:24,815] {dagrun.py:578} ERROR - Marking run <DagRun *** @ 2023-05-04 05:00:00+00:00: scheduled__2023-05-04T05:00:00+00:00, state:running, queued_at: 2023-05-04 06:00:02.373050+00:00. externally triggered: False> failed
[2023-05-04 02:28:24,816] {dagrun.py:644} INFO - DagRun *** dag_id=*** execution_date=2023-05-04 05:00:00+00:00, run_id=scheduled__2023-05-04T05:00:00+00:00, run_start_date=2023-05-04 06:00:02.574912+00:00, run_end_date=2023-05-04 06:28:24.815624+00:00, run_duration=1702.240712, state=failed, external_trigger=False, run_type=scheduled, data_interval_start=2023-05-04 05:00:00+00:00, data_interval_end=2023-05-04 06:00:00+00:00, dag_hash=4d788176fc57f5eb934f9ab5b04a02db
[2023-05-04 02:28:24,839] {dag.py:3336} INFO - Setting next_dagrun for *** to 2023-05-04T06:00:00+00:00, run_after=2023-05-04T07:00:00+00:00
[2023-05-04 02:28:25,750] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
(traceback identical to the first "PID of job runner does not match" traceback above)
[2023-05-04 02:28:25,805] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-04T05:00:00+00:00 exited with status failed for try_number 1
[2023-05-04 02:28:25,805] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-04T05:00:00+00:00 exited with status failed for try_number 1
[2023-05-04 02:28:25,805] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-03T06:00:00+00:00 exited with status failed for try_number 1
[2023-05-04 02:28:25,805] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-03T06:00:00+00:00 exited with status failed for try_number 1
[2023-05-04 02:28:25,806] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-03T06:00:00+00:00 exited with status failed for try_number 1
[2023-05-04 02:28:25,806] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-03T06:00:00+00:00 exited with status failed for try_number 1
[2023-05-04 02:28:25,806] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-03T06:00:00+00:00 exited with status failed for try_number 1
[2023-05-04 02:28:25,806] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-04T04:05:00+00:00 exited with status failed for try_number 1
[2023-05-04 02:28:25,827] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-04T05:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:18.434524+00:00, run_end_date=2023-05-04 06:28:22.240840+00:00, run_duration=3.80632, state=up_for_retry, executor_state=failed, try_number=1, max_tries=2, job_id=277260, pool=default_pool, queue=default, priority_weight=1, operator=SSHOperator, queued_dttm=2023-05-04 06:28:15.998301+00:00, queued_by_job_id=233035, pid=3738390
[2023-05-04 02:28:25,827] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-04T04:05:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:15.776811+00:00, run_end_date=2023-05-04 06:28:25.632014+00:00, run_duration=9.8552, state=up_for_retry, executor_state=failed, try_number=1, max_tries=2, job_id=277246, pool=default_pool, queue=default, priority_weight=1, operator=SSHOperator, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738075
[2023-05-04 02:28:25,827] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-04T05:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:15.996686+00:00, run_end_date=2023-05-04 06:28:22.783233+00:00, run_duration=6.78655, state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277247, pool=default_pool, queue=default, priority_weight=1, operator=SSHOperator, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738153
[2023-05-04 02:28:25,828] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-03T06:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:15.871250+00:00, run_end_date=2023-05-04 06:28:22.188521+00:00, run_duration=6.31727, state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277245, pool=default_pool, queue=default, priority_weight=2, operator=ExternalTaskSensor, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738084
[2023-05-04 02:28:25,828] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-03T06:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:15.669887+00:00, run_end_date=2023-05-04 06:28:21.537265+00:00, run_duration=5.86738, state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277239, pool=default_pool, queue=default, priority_weight=2, operator=ExternalTaskSensor, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738064
[2023-05-04 02:28:25,829] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-03T06:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:15.554503+00:00, run_end_date=2023-05-04 06:28:22.238525+00:00, run_duration=6.68402, state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277240, pool=default_pool, queue=default, priority_weight=2, operator=ExternalTaskSensor, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738047
[2023-05-04 02:28:25,829] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-03T06:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:16.023921+00:00, run_end_date=2023-05-04 06:28:21.406309+00:00, run_duration=5.38239, state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277253, pool=default_pool, queue=default, priority_weight=2, operator=ExternalTaskSensor, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738156
[2023-05-04 02:28:25,829] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-03T06:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:16.405103+00:00, run_end_date=2023-05-04 06:28:21.631632+00:00, run_duration=5.22653, state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277252, pool=default_pool, queue=default, priority_weight=2, operator=ExternalTaskSensor, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738180
[2023-05-04 02:28:25,891] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
(traceback identical to the first "PID of job runner does not match" traceback above)
[2023-05-04 02:28:26,251] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
Traceback (most recent call last):
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/executors/local_executor.py", line 126, in _execute_work_in_fork
args.func(args)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/cli_parser.py", line 52, in command
return func(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/cli.py", line 103, in wrapper
return f(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 382, in task_run
_run_task_by_selected_method(args, dag, ti)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 189, in _run_task_by_selected_method
_run_task_by_local_task_job(args, ti)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 247, in _run_task_by_local_task_job
run_job.run()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", line 247, in run
self._execute()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py", line 135, in _execute
self.heartbeat()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", line 228, in heartbeat
self.heartbeat_callback(session=session)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/session.py", line 72, in wrapper
return func(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py", line 208, in heartbeat_callback
raise AirflowException("PID of job runner does not match")
airflow.exceptions.AirflowException: PID of job runner does not match
[2023-05-04 02:28:27,011] {dagrun.py:578} ERROR - Marking run <DagRun *** @ 2023-04-16 00:00:00+00:00: scheduled__2023-04-16T00:00:00+00:00, state:running, queued_at: 2023-05-01 00:00:06.474932+00:00. externally triggered: False> failed
[2023-05-04 02:28:27,012] {dagrun.py:644} INFO - DagRun *** dag_id=*** execution_date=2023-04-16 00:00:00+00:00, run_id=scheduled__2023-04-16T00:00:00+00:00, run_start_date=2023-05-01 00:00:06.606220+00:00, run_end_date=2023-05-04 06:28:27.012156+00:00, run_duration=282500.405936, state=failed, external_trigger=False, run_type=scheduled, data_interval_start=2023-04-16 00:00:00+00:00, data_interval_end=2023-05-01 00:00:00+00:00, dag_hash=71706b64a19537a7749aa8af2dd1f8e0
[2023-05-04 02:28:27,021] {dag.py:3336} INFO - Setting next_dagrun for *** to 2023-05-01T00:00:00+00:00, run_after=2023-05-16T00:00:00+00:00
[2023-05-04 02:28:28,203] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-03T20:05:00+00:00 exited with status failed for try_number 1
[2023-05-04 02:28:28,203] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-04-16T00:00:00+00:00 exited with status failed for try_number 1
[2023-05-04 02:28:28,212] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-04-16T00:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:20.053437+00:00, run_end_date=2023-05-04 06:28:25.618675+00:00, run_duration=5.56524, state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277266, pool=default_pool, queue=default, priority_weight=1, operator=SSHOperator, queued_dttm=2023-05-04 06:28:18.736412+00:00, queued_by_job_id=233035, pid=3738527
[2023-05-04 02:28:28,213] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-03T20:05:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:16.315525+00:00, run_end_date=2023-05-04 06:28:25.669738+00:00, run_duration=9.35421, state=up_for_retry, executor_state=failed, try_number=1, max_tries=2, job_id=277256, pool=default_pool, queue=default, priority_weight=1, operator=SSHOperator, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738170
[2023-05-04 02:28:29,882] {scheduler_job.py:346} INFO - 1 tasks up for execution:
<TaskInstance: *** scheduled__2023-05-02T20:00:00+00:00 [scheduled]>
[2023-05-04 02:28:29,882] {scheduler_job.py:411} INFO - DAG *** has 1/50 running and queued tasks
[2023-05-04 02:28:29,882] {scheduler_job.py:497} INFO - Setting the following tasks to queued state:
<TaskInstance: *** scheduled__2023-05-02T20:00:00+00:00 [scheduled]>
[2023-05-04 02:28:29,887] {scheduler_job.py:536} INFO - Sending TaskInstanceKey(dag_id=*** ***, run_id='scheduled__2023-05-02T20:00:00+00:00', try_number=1, map_index=-1) to executor with priority 1 and queue default
[2023-05-04 02:28:29,887] {base_executor.py:95} INFO - Adding to queue: ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-02T20:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:29,905] {local_executor.py:81} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', '***', '***', 'scheduled__2023-05-02T20:00:00+00:00', '--local', '--subdir', 'DAGS_FOLDER/***.py']
[2023-05-04 02:28:30,010] {dagbag.py:537} INFO - Filling up the DagBag from /mnt/alpha/airflow/dags/***.py
[2023-05-04 02:28:31,085] {base.py:71} INFO - Using connection ID ssh_*** for task execution.
[2023-05-04 02:28:31,720] {task_command.py:376} INFO - Running <TaskInstance: *** scheduled__2023-05-02T20:00:00+00:00 [queued]> on host ***
[2023-05-04 02:28:32,005] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-02T20:00:00+00:00 exited with status success for try_number 1
[2023-05-04 02:28:32,014] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-02T20:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:18.191068+00:00, run_end_date=2023-05-04 06:28:28.899988+00:00, run_duration=10.7089, state=failed, executor_state=success, try_number=1, max_tries=1, job_id=277259, pool=default_pool, queue=default, priority_weight=3, operator=SSHOperator, queued_dttm=2023-05-04 06:28:15.998301+00:00, queued_by_job_id=233035, pid=3738373
[2023-05-04 02:28:38,357] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-02T20:00:00+00:00 exited with status success for try_number 1
[2023-05-04 02:28:38,396] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-02T20:00:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:31.867672+00:00, run_end_date=2023-05-04 06:28:33.320185+00:00, run_duration=1.45251, state=success, executor_state=success, try_number=1, max_tries=1, job_id=277272, pool=default_pool, queue=default, priority_weight=1, operator=PythonOperator, queued_dttm=2023-05-04 06:28:29.883257+00:00, queued_by_job_id=233035, pid=3739407
[2023-05-04 02:28:40,472] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
Traceback (most recent call last):
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/executors/local_executor.py", line 126, in _execute_work_in_fork
args.func(args)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/cli_parser.py", line 52, in command
return func(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/cli.py", line 103, in wrapper
return f(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 382, in task_run
_run_task_by_selected_method(args, dag, ti)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 189, in _run_task_by_selected_method
_run_task_by_local_task_job(args, ti)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 247, in _run_task_by_local_task_job
run_job.run()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", line 247, in run
self._execute()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py", line 135, in _execute
self.heartbeat()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", line 228, in heartbeat
self.heartbeat_callback(session=session)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/session.py", line 72, in wrapper
return func(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py", line 208, in heartbeat_callback
raise AirflowException("PID of job runner does not match")
airflow.exceptions.AirflowException: PID of job runner does not match
[2023-05-04 02:28:41,951] {dagrun.py:578} ERROR - Marking run <DagRun *** @ 2023-05-02 20:00:00+00:00: scheduled__2023-05-02T20:00:00+00:00, state:running, queued_at: 2023-05-03 20:00:01.228399+00:00. externally triggered: False> failed
[2023-05-04 02:28:41,952] {dagrun.py:644} INFO - DagRun *** dag_id=*** execution_date=2023-05-02 20:00:00+00:00, run_id=scheduled__2023-05-02T20:00:00+00:00, run_start_date=2023-05-03 20:00:01.913706+00:00, run_end_date=2023-05-04 06:28:41.952025+00:00, run_duration=37720.038319, state=failed, external_trigger=False, run_type=scheduled, data_interval_start=2023-05-02 20:00:00+00:00, data_interval_end=2023-05-03 20:00:00+00:00, dag_hash=07a45c48201bfc14afb9d7e4c643bbdb
[2023-05-04 02:28:41,980] {dag.py:3336} INFO - Setting next_dagrun for *** to 2023-05-03T20:00:00+00:00, run_after=2023-05-04T20:00:00+00:00
[2023-05-04 02:28:42,422] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
Traceback (most recent call last):
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/executors/local_executor.py", line 126, in _execute_work_in_fork
args.func(args)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/cli_parser.py", line 52, in command
return func(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/cli.py", line 103, in wrapper
return f(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 382, in task_run
_run_task_by_selected_method(args, dag, ti)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 189, in _run_task_by_selected_method
_run_task_by_local_task_job(args, ti)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 247, in _run_task_by_local_task_job
run_job.run()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", line 247, in run
self._execute()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py", line 135, in _execute
self.heartbeat()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", line 228, in heartbeat
self.heartbeat_callback(session=session)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/session.py", line 72, in wrapper
return func(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py", line 208, in heartbeat_callback
raise AirflowException("PID of job runner does not match")
airflow.exceptions.AirflowException: PID of job runner does not match
[2023-05-04 02:28:42,502] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-03T01:03:00+00:00 exited with status failed for try_number 1
[2023-05-04 02:28:42,502] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-04T06:20:00+00:00 exited with status failed for try_number 1
[2023-05-04 02:28:42,511] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-03T01:03:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:16.302772+00:00, run_end_date=2023-05-04 06:28:38.024822+00:00, run_duration=21.722, state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277255, pool=default_pool, queue=default, priority_weight=2, operator=SSHOperator, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738168
[2023-05-04 02:28:42,511] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-04T06:20:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:16.282425+00:00, run_end_date=2023-05-04 06:28:41.372880+00:00, run_duration=25.0905, state=failed, executor_state=failed, try_number=1, max_tries=0, job_id=277251, pool=default_pool, queue=default, priority_weight=1, operator=SSHOperator, queued_dttm=2023-05-04 06:28:13.784565+00:00, queued_by_job_id=233035, pid=3738169
[2023-05-04 02:28:42,661] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
Traceback (most recent call last):
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/executors/local_executor.py", line 126, in _execute_work_in_fork
args.func(args)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/cli_parser.py", line 52, in command
return func(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/cli.py", line 103, in wrapper
return f(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 382, in task_run
_run_task_by_selected_method(args, dag, ti)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 189, in _run_task_by_selected_method
_run_task_by_local_task_job(args, ti)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 247, in _run_task_by_local_task_job
run_job.run()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", line 247, in run
self._execute()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py", line 135, in _execute
self.heartbeat()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", line 228, in heartbeat
self.heartbeat_callback(session=session)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/session.py", line 72, in wrapper
return func(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py", line 208, in heartbeat_callback
raise AirflowException("PID of job runner does not match")
airflow.exceptions.AirflowException: PID of job runner does not match
[2023-05-04 02:28:43,636] {dagrun.py:578} ERROR - Marking run <DagRun *** @ 2023-05-04 06:20:00+00:00: scheduled__2023-05-04T06:20:00+00:00, state:running, queued_at: 2023-05-04 06:25:33.747936+00:00. externally triggered: False> failed
[2023-05-04 02:28:43,637] {dagrun.py:644} INFO - DagRun *** dag_id=*** execution_date=2023-05-04 06:20:00+00:00, run_id=scheduled__2023-05-04T06:20:00+00:00, run_start_date=2023-05-04 06:25:33.806284+00:00, run_end_date=2023-05-04 06:28:43.637144+00:00, run_duration=189.83086, state=failed, external_trigger=False, run_type=scheduled, data_interval_start=2023-05-04 06:20:00+00:00, data_interval_end=2023-05-04 06:25:00+00:00, dag_hash=c6c806de8a6d29d37d214803630a2667
[2023-05-04 02:28:43,646] {dag.py:3336} INFO - Setting next_dagrun for *** to 2023-05-04T06:25:00+00:00, run_after=2023-05-04T06:30:00+00:00
[2023-05-04 02:28:44,586] {scheduler_job.py:588} INFO - Executor reports execution of *** run_id=scheduled__2023-05-04T05:05:00+00:00 exited with status failed for try_number 1
[2023-05-04 02:28:44,595] {scheduler_job.py:631} INFO - TaskInstance Finished: dag_id=*** task_id=***, run_id=scheduled__2023-05-04T05:05:00+00:00, map_index=-1, run_start_date=2023-05-04 06:28:21.528300+00:00, run_end_date=2023-05-04 06:28:42.518193+00:00, run_duration=20.9899, state=up_for_retry, executor_state=failed, try_number=1, max_tries=3, job_id=277270, pool=default_pool, queue=default, priority_weight=3, operator=SSHOperator, queued_dttm=2023-05-04 06:28:15.998301+00:00, queued_by_job_id=233035, pid=3738666
[2023-05-04 02:28:46,942] {dagrun.py:578} ERROR - Marking run <DagRun *** @ 2023-05-03 01:03:00+00:00: scheduled__2023-05-03T01:03:00+00:00, state:running, queued_at: 2023-05-04 01:03:04.048892+00:00. externally triggered: False> failed
[2023-05-04 02:28:46,942] {dagrun.py:644} INFO - DagRun *** dag_id=*** execution_date=2023-05-03 01:03:00+00:00, run_id=scheduled__2023-05-03T01:03:00+00:00, run_start_date=2023-05-04 01:03:04.421634+00:00, run_end_date=2023-05-04 06:28:46.942643+00:00, run_duration=19542.521009, state=failed, external_trigger=False, run_type=scheduled, data_interval_start=2023-05-03 01:03:00+00:00, data_interval_end=2023-05-04 01:03:00+00:00, dag_hash=5567f4afd9ffba286e0af33d0abbabd7
[2023-05-04 02:28:46,985] {dag.py:3336} INFO - Setting next_dagrun for *** to 2023-05-04T01:03:00+00:00, run_after=2023-05-05T01:03:00+00:00
[2023-05-04 02:28:47,298] {local_executor.py:130} ERROR - Failed to execute task PID of job runner does not match.
Traceback (most recent call last):
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/executors/local_executor.py", line 126, in _execute_work_in_fork
args.func(args)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/cli_parser.py", line 52, in command
return func(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/cli.py", line 103, in wrapper
return f(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 382, in task_run
_run_task_by_selected_method(args, dag, ti)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 189, in _run_task_by_selected_method
_run_task_by_local_task_job(args, ti)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/cli/commands/task_command.py", line 247, in _run_task_by_local_task_job
run_job.run()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", line 247, in run
self._execute()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py", line 135, in _execute
self.heartbeat()
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/base_job.py", line 228, in heartbeat
self.heartbeat_callback(session=session)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/utils/session.py", line 72, in wrapper
return func(*args, **kwargs)
File "/mnt/alpha/miniconda/lib/python3.10/site-packages/airflow/jobs/local_task_job.py", line 208, in heartbeat_callback
raise AirflowException("PID of job runner does not match")
airflow.exceptions.AirflowException: PID of job runner does not match
@uranusjr can you add 2.4.3 also as an affected version ?
Could you tell us more about your case @shubhampatel94? Did it happen once? Did it start happening continuously? Did you (or the deployment you are running) experience some kind of event (restart, being terminated, or similar)?
I am just trying to see if someone can explain the circumstances under which it happens. It does not seem to be a common occurrence; people experience it occasionally, and I think it is caused by some race condition involved in starting and terminating processes quickly.
If it happened once and was accompanied by a deployment issue that caused termination of running processes, I would not be surprised to see a similar issue.
@shubhampatel94 and @potiuk -
Note that since (my) patch was applied that fixed the race condition, I have occasionally seen this error when the process was killed for another reason - for example, we have OS monitors that will kill processes that are being bad citizens, or certain times when the task had an unexpected exception and died by itself.
I've verified in these cases that the task failed to complete and Airflow throws this error message, but Airflow is not the ultimate culprit terminating the tasks; this is the message produced when the containing process of the dead task is terminated. Is it possible that this is what is happening with you?
At some point I was going to try to dig deeper to verify the situation and propose a better way to identify it and send a better error message, but I haven't had the time.
Note that since (my) patch was applied that fixed the race condition, I have occasionally seen this error when the process was killed for another reason - for example, we have OS monitors that will kill processes that are being bad citizens, or certain times when the task had an unexpected exception and died by itself.
Precisely. What you explain is what I suspected: a very rare, externally triggered event. That's how it looks from the logs. It actually looks like something killed a bunch of running tasks, but the original local task jobs were not killed, and Airflow then complained about those "child" processes missing. If that only happens occasionally as a result of some unbounded killing of processes, I would be for just closing this one. We are not able to handle every scenario where something randomly kills processes. Airflow is not a 99.999%-available system that is supposed to handle absolutely all such situations; developing such systems is extremely costly, and there is little incentive to spend a lot of time perfecting this when there is a nice UI and monitoring that can warn about such situations and let a human fix them by re-running the tasks.
@potiuk @kcphila Thanks for looking into what I have pointed out. It has only happened once so far, and only within a 6-minute window; things stabilized after that. The trigger point, I am assuming, is:
DagFileProcessorManager (PID=3732979) exited with exit code 1 - re-launching
What I have observed is that Airflow terminated all running tasks within that 6-minute window, and then it was business as usual.
So I would not really worry about it. I think we can close this one, and we can re-open it if we get more reproducible cases beyond occasional failures like this (which just might happen).
Thanks, @potiuk. I will re-raise it if I observe the issue again.
Apache Airflow version: 2.1.3
Apache Airflow Provider versions (please include all providers that are relevant to your bug):
Environment:
I'm using the airflow-2.1.2 container from dockerhub.
Debian GNU/Linux 10 (buster)
uname -a: Linux fe52079d9ade 5.12.13-200.fc33.x86_64 #1 SMP Wed Jun 23 16:20:26 UTC 2021 x86_64 GNU/Linux
What happened:
When using the EMRStepSensor (set to reschedule mode) to monitor EMR steps, the task sometimes fails even though the EMR step ran successfully. Most of the time the sensor works fine, but every so often this issue occurs (on the same DAG, without modifications).
EMRStepSensor task instance debug log
```
*** Reading local file: /opt/airflow/logs/derived.adobe_target_catalog_sporza/watch_adobe_target_catalog_sporza_job_emr_step/2021-08-07T05:28:00+00:00/1.log
[2021-08-08 05:29:20,367] {__init__.py:51} DEBUG - Loading core task runner: StandardTaskRunner
[2021-08-08 05:29:21,594] {base_task_runner.py:62} DEBUG - Planning to run as the user
[2021-08-08 05:29:21,597] {taskinstance.py:614} DEBUG - Refreshing TaskInstance
```
What you expected to happen:
I'd expect the EMRStepSensor to run until the EMR step succeeded, and report a succesful run.
If my understanding is correct, the final lines in the log show the runner terminating the task process. If I'm reading the log correctly, 8339 is the correct PID for the task, and the recorded pid 7972 is the PID of a previous run. Could it be that this PID is not being updated correctly?
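The check described above can be sketched as follows. This is a simplified paraphrase of the PID guard in `LocalTaskJob.heartbeat_callback`, not the actual Airflow source: the `recorded_pid` stored on the task instance row in the metadata DB is compared against the PID of the process the job is actually supervising, and a stale recorded PID (e.g. from a previous try) trips the guard.

```python
import os


def heartbeat_callback(recorded_pid: int, same_process: bool = True) -> None:
    """Simplified sketch of the PID guard in the local task job heartbeat.

    `recorded_pid` stands in for the PID written to the task instance row
    when the raw task process started; `same_process` stands in for the
    runner reporting that the task runs in the supervised process itself.
    """
    current_pid = os.getpid()  # stand-in for the supervised process PID
    if same_process and recorded_pid != current_pid:
        # Airflow raises AirflowException here; RuntimeError keeps the
        # sketch dependency-free.
        raise RuntimeError(
            f"Recorded pid {recorded_pid} does not match "
            f"the current pid {current_pid}"
        )


# A stale recorded PID (e.g. left over from a run that was killed
# externally) causes the heartbeat to fail the task:
try:
    heartbeat_callback(recorded_pid=-1)
except RuntimeError as exc:
    print(exc)
```

If the row still holds the PID of a previous attempt when the heartbeat fires, the comparison fails exactly as in the log above, even though the new task process itself is healthy.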
Anything else we need to know:
The symptoms look very similar to #17394, but I'm not using run_as_user, and the reported pids are not the same, so I'm not sure whether this is the same issue.