apache / airflow

Apache Airflow - A platform to programmatically author, schedule, and monitor workflows
https://airflow.apache.org/
Apache License 2.0

airflow scheduling Checking dependencies on 2 tasks instances, minus 0 skippable ones #1189

Closed: basebase closed this issue 8 years ago

basebase commented 8 years ago

Hello all: I scheduled my DAG with the command `nohup airflow scheduler -d bigscreen_daily_job &`, but the task state stays stuck at Running. See the nohup log:

```
[2016-03-22 10:25:16,169] {jobs.py:455} INFO - Getting list of tasks to skip for active runs.
[2016-03-22 10:25:16,174] {jobs.py:470} INFO - Checking dependencies on 2 tasks instances, minus 0 skippable ones
[2016-03-22 10:25:16,216] {jobs.py:633} INFO - Done queuing tasks, calling the executor's heartbeat
[2016-03-22 10:25:16,216] {jobs.py:636} INFO - Loop took: 0.20188 seconds
[2016-03-22 10:25:16,219] {models.py:222} INFO - Finding 'running' jobs without a recent heartbeat
[2016-03-22 10:25:16,220] {models.py:228} INFO - Failing jobs without heartbeat after 2016-03-22 10:23:13.220485
[2016-03-22 10:25:21,029] {jobs.py:507} INFO - Prioritizing 0 queued jobs
[2016-03-22 10:25:21,164] {models.py:2156} INFO - Checking state for <DagRun bigscreen_daily_job @ 2016-03-21 00:00:00: scheduled2016-03-21T00:00:00, externally triggered: False>
[2016-03-22 10:25:21,169] {models.py:2156} INFO - Checking state for <DagRun bigscreen_daily_job @ 2016-03-21 02:20:00: scheduled2016-03-21T02:20:00, externally triggered: False>
[2016-03-22 10:25:21,174] {jobs.py:455} INFO - Getting list of tasks to skip for active runs.
[2016-03-22 10:25:21,179] {jobs.py:470} INFO - Checking dependencies on 2 tasks instances, minus 0 skippable ones
[2016-03-22 10:25:21,222] {jobs.py:633} INFO - Done queuing tasks, calling the executor's heartbeat
[2016-03-22 10:25:21,222] {jobs.py:636} INFO - Loop took: 0.20753 seconds
[2016-03-22 10:25:21,225] {models.py:222} INFO - Finding 'running' jobs without a recent heartbeat
[2016-03-22 10:25:21,226] {models.py:228} INFO - Failing jobs without heartbeat after 2016-03-22 10:23:18.226460
[2016-03-22 10:25:26,025] {jobs.py:507} INFO - Prioritizing 0 queued jobs
```

What is the solution?

Thanks.

r39132 commented 8 years ago

@basebase

  1. Provide example code
  2. Verify you are running on UTC everywhere
  3. What version of the code are you running? A release, a private fork, or master HEAD?
  4. Provide image snapshots of your Graph and Tree views if you find that useful
  5. System details (e.g. which executor?)
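
Point 2 (running on UTC everywhere) can be spot-checked with a short snippet like this (an editorial sketch, not from the thread; Airflow 1.x used naive datetimes and assumed the system clock):

```python
from datetime import datetime, timedelta

# Compare the host's local clock to UTC; a nonzero offset means the
# machine is not running on UTC, which old Airflow releases assumed.
offset = datetime.now().astimezone().utcoffset()
print("local-to-UTC offset:", offset)
print("on UTC:", offset == timedelta(0))
```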
basebase commented 8 years ago

@r39132

1. My test code:

```python
#!/usr/bin/env python
# -*- coding: UTF-8 -*-

from airflow import DAG
from airflow.operators import BashOperator, DummyOperator
from datetime import datetime, timedelta
import logging

default_args = {
    'owner': 'jf',
    'depends_on_past': True,
    'start_date': datetime(2016, 3, 23),  # no leading zero: 03 is an octal literal
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
    # 'queue': 'bash_queue',
    # 'pool': 'backfill',
    # 'priority_weight': 10,
    # 'end_date': datetime(2016, 1, 1),
}

dag = DAG('bigscreen_daily_job', default_args=default_args, schedule_interval='0 2 * * *')

bigscreen_daily_task = BashOperator(
    task_id='bigscreen_daily_task',
    bash_command='python /data/airflow/dags/bigscreen/main.py {{ ds }} /data/production/nginx/www/bigscreen',
    dag=dag,
)
```
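
An editorial aside on the snippet above, not part of the original comment: the `start_date` is 2016-03-23, yet the scheduler log is dated 2016-03-22, and `depends_on_past=True` makes each run wait on its predecessor. In old Airflow releases this combination was a common cause of runs sitting idle. A hedged sketch of `default_args` with a past `start_date` and the dependency relaxed (dates are illustrative, not a confirmed fix for this issue):

```python
from datetime import datetime, timedelta

# Illustrative adjustment (an assumption, not confirmed in the thread):
# put start_date safely in the past so the scheduler has intervals to
# create, and disable depends_on_past until a first run has succeeded.
default_args = {
    'owner': 'jf',
    'depends_on_past': False,
    'start_date': datetime(2016, 3, 20),  # before the log date 2016-03-22
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

print(default_args['start_date'].date())
```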
2. The Airflow web UI shows UTC time. I can't upload images to GitHub issues, sorry.

3. Airflow version: v1.6.2

4. System: CentOS release 6.7 (Final)

My English is not good; I partly rely on translation.

jlowin commented 8 years ago

@basebase please try again using the code in GitHub master. I think #1271 (just merged) may address your issue.

bolkedebruin commented 8 years ago

Please open a Jira issue if the problem persists.