Closed · MARCTELLY closed this issue 2 years ago
Can you please provide the schema of the table in BigQuery?
Hello @davidrabinowitz @suryasoma, I made an update to the issue.
Hello @MARCTELLY, the issue is fixed and the fix will be available in the next release. Thanks
Hello, I am having the same issue. I have been writing to an existing table every day via a workflow since 2020, but since June 1, the date of the latest release, I cannot write anymore because of the same problem mentioned above: Caused by: java.lang.IllegalArgumentException: com.google.cloud.bigquery.connector.common.BigQueryConnectorException$InvalidSchemaException: Destination table's schema is not compatible with dataframe's schema
So, when will the next release be, @suryasoma? Or how can I fix this now? It is very important to my work.
Thanks
Hi, I have the same issue, but I was able to work around it by forcing my field in Spark to be nullable before writing, since my BigQuery schema was configured that way.
It seems this issue affects version 0.25.0; 0.24.2 is OK.
I changed to a previous version and it works. Thanks!
Hey everyone, please find the fix for this in the latest release, 0.25.2. Thanks
Does the issue still persist in 0.34? I tried to run a Python dbt model: the first time, the table gets created, but on subsequent runs (with the same data) it gives the error "Destination table's schema is not compatible with dataframe's schema".
It shouldn't. Can you please provide more detail, preferably as a new issue?
thank you, posted here: https://github.com/GoogleCloudDataproc/spark-bigquery-connector/issues/1149
Hi all,
Since the last release of the connector some of our Spark jobs started to fail with the following error:
After some investigation, this seems to be related to PR #613, which added a schema equality check before writing.
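To make the behavior change concrete, here is a simplified pure-Python sketch (not the connector's actual code) contrasting a strict equality check with the previous, more permissive behavior, where a dataframe covering only a subset of the table's nullable columns was accepted. Schemas are modeled as dicts of column name to (type, nullable).

```python
# Hypothetical schemas: the table has two NULLABLE STRING columns,
# the dataframe only one of them.
table_schema = {"name": ("STRING", True), "city": ("STRING", True)}
df_schema = {"name": ("STRING", True)}

def equal_check(df, table):
    # Simplified post-#613 behavior: schemas must match exactly.
    return df == table

def subset_check(df, table):
    # Simplified previous behavior: every dataframe column must exist in
    # the table with the same type; table columns absent from the
    # dataframe are fine as long as they are nullable.
    for name, (typ, _) in df.items():
        if name not in table or table[name][0] != typ:
            return False
    return all(nullable for name, (_, nullable) in table.items()
               if name not in df)

assert subset_check(df_schema, table_schema)      # old behavior: write allowed
assert not equal_check(df_schema, table_schema)   # new check: write rejected
```

The real connector compares full Spark and BigQuery schemas (types, modes, nesting); this sketch only captures why a missing nullable column now fails the comparison.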
For example: Given a BigQuery table with the following schema (2 nullable string columns):
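The original column names were not preserved in this thread; a hypothetical two-column schema of that shape, written as a `bq` JSON schema file, would look like:

```json
[
  {"name": "name", "type": "STRING", "mode": "NULLABLE"},
  {"name": "city", "type": "STRING", "mode": "NULLABLE"}
]
```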
And the following Spark dataframe (1 nullable string column):
The following code worked:
With the new version this does not work, as well as the following code, both raising the exception shown earlier:
However, the following one (adding the missing nullable column) works:
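The fix described above can be sketched as follows: compute the table columns missing from the dataframe and append each one as a null column before writing. This is not the connector's API, and the column names are hypothetical; the dataframe operations that need a live SparkSession are shown in a comment.

```python
# Hypothetical table and dataframe column sets.
table_columns = {"name": "STRING", "city": "STRING"}  # both NULLABLE
df_columns = ["name"]

# Nullable table columns the dataframe does not provide.
missing = [c for c in table_columns if c not in df_columns]

# With an active SparkSession, this would translate to, e.g.:
#   from pyspark.sql.functions import lit
#   for col in missing:
#       df = df.withColumn(col, lit(None).cast("string"))
#   df.write.format("bigquery").option("table", "dataset.table").save()
print(missing)
```

Appending the missing nullable columns makes the dataframe's schema exactly equal to the table's, which is why this variant passes the new equality check.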
Below I propose a test to be added to
spark-bigquery-tests/src/main/java/com/google/cloud/spark/bigquery/integration/WriteIntegrationTestBase.java
to ensure the continuity of the previous behavior:

Kind regards.