Isarien opened this issue 4 years ago
The waiting task does not catch all errors. In my case, I inadvertently launched a run that points to a missing notebook. The Databricks job fails with an internal error, but the task does not catch it, so it continues to wait.
Here is the JSON response:

```json
{
  "job_id": 1,
  "run_id": 1,
  "number_in_job": 1,
  "original_attempt_run_id": 1,
  "state": {
    "life_cycle_state": "INTERNAL_ERROR",
    "state_message": "Notebook not found: /Shared/main.py"
  },
  "task": {
    "notebook_task": {
      "notebook_path": "/Shared/main.py"
    }
  },
  "cluster_spec": {
    "new_cluster": {
      "spark_version": "5.3.x-scala2.11",
      "node_type_id": "Standard_DS3_v2",
      "enable_elastic_disk": true,
      "num_workers": 2
    }
  },
  "start_time": 1570602949211,
  "setup_duration": 0,
  "execution_duration": 0,
  "cleanup_duration": 0,
  "trigger": "ONE_TIME",
  "creator_user_name": "john_doe@john_doe.fr",
  "run_name": "AzDO Execution",
  "run_page_url": "https://westeurope.azuredatabricks.net/?o=9999999999999#job/1/run/1",
  "run_type": "JOB_RUN"
}
```