snowflakedb / spark-snowflake

Snowflake Data Source for Apache Spark.
http://www.snowflake.net
Apache License 2.0

No option when loading data from Spark to Snowflake to add new columns to an existing table using overwrite mode #554

Open pragathisharma-8451 opened 6 months ago

pragathisharma-8451 commented 6 months ago

The existing table my_table in Snowflake has 10 columns, and we want to add 7 additional columns to it.

new_df holds 17 columns

new_df.write.format("net.snowflake.spark.snowflake") \
    .options(**sfOptions) \
    .option("dbtable", 'my_table') \
    .mode("overwrite") \
    .option("usestagingtable", "off") \
    .save()

An error occurred while calling o761.save. : java.sql.SQLException: Status of query associated with resultSet is FAILED_WITH_ERROR. Number of columns in file (17) does not match that of the corresponding table (10), use file format option error_on_column_count_mismatch=false to ignore this error

We tried various options, such as .option("column_mismatch_behavior", "ignore") and .option("error_on_column_count_mismatch", "false").
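
For reference, this is roughly how those options were attached to the writer (the same write as above, with the two extra .option(...) calls); neither avoided the column-count mismatch:

# Same write as above, with the two additional options we tried.
new_df.write.format("net.snowflake.spark.snowflake") \
    .options(**sfOptions) \
    .option("dbtable", 'my_table') \
    .option("usestagingtable", "off") \
    .option("column_mismatch_behavior", "ignore") \
    .option("error_on_column_count_mismatch", "false") \
    .mode("overwrite") \
    .save()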

Neither worked. Kindly let me know if there are ways to handle this. As a workaround, we are looking at either altering the table to add the new columns before overwriting, or writing to a new temp table and swapping it with my_table; a sketch of the first approach follows below.
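
For illustration only, a minimal sketch of the first workaround, using the Snowflake Python connector (snowflake-connector-python, a separate library from this Spark connector) to add the missing columns before the overwrite. The connection values and the new column names/types below are placeholders, not the real schema:

import snowflake.connector

# Placeholder connection values; in practice these mirror the sfOptions
# (account, user, password, database, schema, warehouse) used by the Spark write.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    database="my_database",
    schema="my_schema",
    warehouse="my_warehouse",
)

# Hypothetical names/types for the 7 new columns.
new_columns = [
    ("new_col_11", "VARCHAR"),
    ("new_col_12", "NUMBER"),
    # ... remaining new columns ...
]

cur = conn.cursor()
try:
    # Add each missing column so my_table has all 17 columns
    # before the Spark overwrite runs.
    for name, col_type in new_columns:
        cur.execute(f"ALTER TABLE my_table ADD COLUMN {name} {col_type}")
finally:
    cur.close()
    conn.close()

# Then run the same new_df.write ... .mode("overwrite") ... .save() as above;
# the column counts now match, provided new_df's columns line up with the
# altered table.

The connector also exposes net.snowflake.spark.snowflake.Utils.runQuery on the JVM side, which some setups reach from PySpark through the py4j bridge to run the same ALTER TABLE (or the temp-table swap) without a second library; whether that works here likely depends on the connector version.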