I need to write a Spark TimestampNTZ column to a BigQuery table. The DATETIME type would be ideal, but I'm facing the error below:
"pyspark.errors.exceptions.captured.IllegalArgumentException: Data type not expected: timestamp_ntz".
When casting the column to StringType, the target column type changes to String (even for an existing, empty table). For a non-empty target table, the write fails with:
"com.google.cloud.bigquery.connector.common.BigQueryConnectorException$InvalidSchemaException: Destination table's schema is not compatible with dataframe's schema".
My preferred method is indirect write, but this issue occurs for direct write as well.
Spark version 3.4.3
com.google.cloud.spark_spark-bigquery-with-dependencies_2.13-0.36.2.jar