Closed: aru-trackunit closed this issue 8 months ago
Indeed, the Databricks Row's __fields__ is itself another Row object when the row is created in two steps. A fix for that is also coming in #36949.
Thanks for the debug print!
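For reference, a minimal sketch of that two-step pattern, assuming the connector's Row (databricks.sql.types.Row) mirrors the PySpark-style implementation where calling a field-name Row with values assigns that Row to __fields__:

```python
# Minimal sketch, assuming databricks-sql-connector's Row mirrors the
# PySpark-style implementation where __call__ assigns the field-name Row
# itself to __fields__ on the resulting data Row.
from databricks.sql.types import Row

schema_row = Row("id", "name")     # step 1: a Row holding only the field names
data_row = schema_row(1, "alpha")  # step 2: calling it produces the data Row

# Expected to show that __fields__ is itself a Row, not a plain list of strings.
print(type(data_row.__fields__), data_row.__fields__)
```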
@Joffreybvn Just retested with Databricks provider 6.1.0 on Airflow 2.8.1, and the issue still persists with a different stack trace:
[2024-01-29, 13:41:53 CET] {taskinstance.py:1956} INFO - Dependencies all met for dep_context=non-requeueable deps ti=<TaskInstance: task_1.read manual__2024-01-29T13:41:35+01:00 [queued]>
[2024-01-29, 13:41:53 CET] {taskinstance.py:1956} INFO - Dependencies all met for dep_context=requeueable deps ti=<TaskInstance: task_1.read manual__2024-01-29T13:41:35+01:00 [queued]>
[2024-01-29, 13:41:53 CET] {taskinstance.py:2170} INFO - Starting attempt 1 of 1
[2024-01-29, 13:41:53 CET] {taskinstance.py:2191} INFO - Executing <Task(DatabricksSqlOperator): read> on 2024-01-29 12:41:35+00:00
[2024-01-29, 13:41:53 CET] {standard_task_runner.py:60} INFO - Started process 131 to run task
[2024-01-29, 13:41:53 CET] {standard_task_runner.py:87} INFO - Running: ['airflow', 'tasks', 'run', 'task_1', 'read', 'manual__2024-01-29T13:41:35+01:00', '--job-id', '6', '--raw', '--subdir', 'DAGS_FOLDER/dag-wn-equipment.py', '--cfg-path', '/tmp/tmpq_30xj_j']
[2024-01-29, 13:41:53 CET] {standard_task_runner.py:88} INFO - Job 6: Subtask read
[2024-01-29, 13:41:53 CET] {task_command.py:423} INFO - Running <TaskInstance: task_1.read manual__2024-01-29T13:41:35+01:00 [running]> on host 33e1fb1e4ed5
[2024-01-29, 13:41:53 CET] {taskinstance.py:2480} INFO - Exporting env vars: AIRFLOW_CTX_DAG_OWNER='team_analytics' AIRFLOW_CTX_DAG_ID='task_1' AIRFLOW_CTX_TASK_ID='read' AIRFLOW_CTX_EXECUTION_DATE='2024-01-29T12:41:35+00:00' AIRFLOW_CTX_TRY_NUMBER='1' AIRFLOW_CTX_DAG_RUN_ID='manual__2024-01-29T13:41:35+01:00'
[2024-01-29, 13:41:53 CET] {sql.py:276} INFO - Executing: SELECT * FROM catalog.schema.test_table LIMIT 10;
[2024-01-29, 13:41:53 CET] {base.py:83} INFO - Using connection ID 'tu-databricks-sp' for task execution.
[2024-01-29, 13:41:54 CET] {databricks_base.py:514} INFO - Using Service Principal Token.
[2024-01-29, 13:41:54 CET] {databricks_base.py:223} INFO - Existing Service Principal token is expired, or going to expire soon. Refreshing...
[2024-01-29, 13:41:54 CET] {databricks_base.py:514} INFO - Using Service Principal Token.
[2024-01-29, 13:41:55 CET] {client.py:200} INFO - Successfully opened session 01eebea3-c8a3-1606-b957-b38c0426a2d7
[2024-01-29, 13:41:55 CET] {sql.py:450} INFO - Running statement: SELECT * FROM catalog.schema.test_table LIMIT 10, parameters: None
[2024-01-29, 13:41:57 CET] {client.py:258} INFO - Closing session 01eebea3-c8a3-1606-b957-b38c0426a2d7
[2024-01-29, 13:41:57 CET] {xcom.py:664} ERROR - Object of type tuple is not JSON serializable. If you are using pickle instead of JSON for XCom, then you need to enable pickle support for XCom in your airflow config or make sure to decorate your object with attr.
[2024-01-29, 13:41:57 CET] {taskinstance.py:2698} ERROR - Task failed with exception
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/utils/json.py", line 91, in default
return serialize(o)
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 147, in serialize
return encode(classname, version, serialize(data, depth + 1))
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 126, in serialize
return [serialize(d, depth + 1) for d in o]
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 126, in <listcomp>
return [serialize(d, depth + 1) for d in o]
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 126, in serialize
return [serialize(d, depth + 1) for d in o]
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 126, in <listcomp>
return [serialize(d, depth + 1) for d in o]
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 180, in serialize
raise TypeError(f"cannot serialize object of type {cls}")
TypeError: cannot serialize object of type <class 'airflow.providers.databricks.hooks.databricks_sql.Row'>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/models/taskinstance.py", line 440, in _execute_task
task_instance.xcom_push(key=XCOM_RETURN_KEY, value=xcom_value, session=session)
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/utils/session.py", line 76, in wrapper
return func(*args, **kwargs)
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/models/taskinstance.py", line 2980, in xcom_push
XCom.set(
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/utils/session.py", line 76, in wrapper
return func(*args, **kwargs)
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/models/xcom.py", line 247, in set
value = cls.serialize_value(
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/models/xcom.py", line 662, in serialize_value
return json.dumps(value, cls=XComEncoder).encode("UTF-8")
File "/usr/local/lib/python3.10/json/__init__.py", line 238, in dumps
**kw).encode(obj)
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/utils/json.py", line 102, in encode
o = self.default(o)
File "/home/airflow/.local/lib/python3.10/site-packages/airflow/utils/json.py", line 93, in default
return super().default(o)
File "/usr/local/lib/python3.10/json/encoder.py", line 179, in default
raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type tuple is not JSON serializable
[2024-01-29, 13:41:57 CET] {taskinstance.py:1138} INFO - Marking task as FAILED. dag_id=task_1, task_id=read, execution_date=20240129T124135, start_date=20240129T124153, end_date=20240129T124157
[2024-01-29, 13:41:57 CET] {standard_task_runner.py:107} ERROR - Failed to execute job 6 for task read (Object of type tuple is not JSON serializable; 131)
I could not reproduce your error. But I found another bug where a single namedtuple 'Row' fails to be serialized, while a list of those namedtuples works fine. Is your query returning only one result?
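For anyone who wants to poke at this outside a DAG, here is a rough sketch of the XCom serialization call from the traceback; the plain namedtuple only stands in for the provider's Row, so it may not hit exactly the same code path:

```python
# Rough sketch of the XCom serialization call shown in the traceback above.
# The plain namedtuple stands in for the provider's Row type and may not
# reproduce the exact failure.
import json
from collections import namedtuple

from airflow.utils.json import XComEncoder

Row = namedtuple("Row", ["id", "name"])

# A list of Row-like namedtuples reportedly serializes fine...
print(json.dumps([Row(1, "a"), Row(2, "b")], cls=XComEncoder))

# ...while a single one reportedly fails with
# "Object of type tuple is not JSON serializable".
try:
    print(json.dumps(Row(1, "a"), cls=XComEncoder))
except TypeError as exc:
    print(exc)
```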
Hi there. I'm running into this same issue. I'm running a merge statement that only returns one row with four columns.
Appreciate your attention to this.
Old versions (no error): apache-airflow-providers-databricks 5.0.1, Airflow v2.5.0
New versions (getting the error): apache-airflow-providers-databricks 6.1.0, Airflow v2.7.3
[2024-02-01, 15:28:06 UTC] {client.py:258} INFO - Closing session 01eec116-6bad-1ab1-9d48-941cb79ab654
[2024-02-01, 15:28:07 UTC] {xcom.py:661} ERROR - Object of type tuple is not JSON serializable. If you are using pickle instead of JSON for XCom, then you need to enable pickle support for XCom in your airflow config or make sure to decorate your object with attr.
[2024-02-01, 15:28:07 UTC] {base.py:73} INFO - Using connection ID 'databricks' for task execution.
[2024-02-01, 15:28:07 UTC] {taskinstance.py:1937} ERROR - Task failed with exception
Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/airflow/utils/json.py", line 91, in default
return serialize(o)
File "/usr/local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 145, in serialize
return encode(classname, version, serialize(data, depth + 1))
File "/usr/local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 124, in serialize
return [serialize(d, depth + 1) for d in o]
File "/usr/local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 124, in <listcomp>
return [serialize(d, depth + 1) for d in o]
File "/usr/local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 124, in serialize
return [serialize(d, depth + 1) for d in o]
File "/usr/local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 124, in <listcomp>
return [serialize(d, depth + 1) for d in o]
File "/usr/local/lib/python3.10/site-packages/airflow/serialization/serde.py", line 178, in serialize
raise TypeError(f"cannot serialize object of type {cls}")
TypeError: cannot serialize object of type <class 'airflow.providers.databricks.hooks.databricks_sql.Row'>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/airflow/utils/session.py", line 76, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.10/site-packages/airflow/models/taskinstance.py", line 2479, in xcom_push
XCom.set(
File "/usr/local/lib/python3.10/site-packages/airflow/utils/session.py", line 76, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.10/site-packages/airflow/models/xcom.py", line 244, in set
value = cls.serialize_value(
File "/usr/local/lib/python3.10/site-packages/airflow/models/xcom.py", line 659, in serialize_value
return json.dumps(value, cls=XComEncoder).encode("UTF-8")
File "/usr/local/lib/python3.10/json/__init__.py", line 238, in dumps
**kw).encode(obj)
File "/usr/local/lib/python3.10/site-packages/airflow/utils/json.py", line 102, in encode
o = self.default(o)
File "/usr/local/lib/python3.10/site-packages/airflow/utils/json.py", line 93, in default
return super().default(o)
File "/usr/local/lib/python3.10/json/encoder.py", line 179, in default
raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type tuple is not JSON serializable
[2024-02-01, 15:28:07 UTC] {taskinstance.py:1400} INFO - Marking task as FAILED. dag_id=Aggregates_Hit, task_id=update_tm_hit_perc, execution_date=20240201T152717, start_date=20240201T152732, end_date=20240201T152807
[2024-02-01, 15:28:07 UTC] {standard_task_runner.py:104} ERROR - Failed to execute job 2054213 for task update_tm_hit_perc (Object of type tuple is not JSON serializable; 31)
[2024-02-01, 15:28:07 UTC] {local_task_job_runner.py:228} INFO - Task exited with return code 1
[2024-02-01, 15:28:07 UTC] {taskinstance.py:2778} INFO - 0 downstream tasks scheduled from follow-on schedule check
@odykstra any resolution on this issue? I'm also getting this error
@avivshafir The newest versions should have fixed this issue, so check your versions and upgrade if possible. The other option is to pin your Databricks provider version explicitly in the requirements.txt file.
I have my version held like so:
apache-airflow-providers-databricks==6.6.0
but I know that anything in version 5 does not have the issue either.
Apache Airflow Provider(s)
databricks
Versions of Apache Airflow Providers
The error was not present in apache-airflow-providers-databricks==4.7.0. I upgraded to the latest, apache-airflow-providers-databricks==6.0.0, and the error is present.
Apache Airflow version
2.8.0
Operating System
Debian GNU/Linux 11 (bullseye)
Deployment
Official Apache Airflow Helm Chart
Deployment details
No response
What happened
I did a little investigation: I edited the hooks/databricks_sql.py file and added prints in the _make_common_data_structure method to inspect the value of the result variable and the rows_fields variable.
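Something along these lines (a simplified stand-in for _make_common_data_structure, not the provider's actual code; only meant to show where the prints go):

```python
# Simplified stand-in for DatabricksSqlHook._make_common_data_structure with
# the kind of debug prints described above. Not the provider's actual code.
from collections import namedtuple


def _make_common_data_structure(result):
    """Convert connector Row objects into plain namedtuples (simplified)."""
    rows_fields = result[0].__fields__
    print("result:", result)                               # debug print
    print("rows_fields:", rows_fields, type(rows_fields))  # may itself be a Row
    row_cls = namedtuple("Row", rows_fields)
    return [row_cls(*row) for row in result]
```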
What you think should happen instead
No response
How to reproduce
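A minimal DAG sketch pieced together from the log output above; the dag id, task id, SQL, and connection id come from the logs, while the http_path and dates are placeholders:

```python
# Minimal sketch reconstructed from the log output above; http_path and the
# dates are placeholders, not values from the original setup.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks_sql import DatabricksSqlOperator

with DAG(
    dag_id="task_1",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
):
    # The query result is pushed to XCom by default; that is where the JSON
    # serialization of the returned rows fails.
    DatabricksSqlOperator(
        task_id="read",
        databricks_conn_id="tu-databricks-sp",
        http_path="/sql/1.0/warehouses/<warehouse-id>",  # placeholder
        sql="SELECT * FROM catalog.schema.test_table LIMIT 10",
    )
```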
Anything else
No response
Are you willing to submit PR?
Code of Conduct