mikeperello-scopely opened 9 months ago
This would be super helpful for merge queries. It often happens that you need to update and/or insert rows in a table; the usual pattern is to write the new data to a temporary table and then run a MERGE query. Currently, with the connector, you can perform neither the MERGE query nor the temporary-table deletion. If you could add an option to set an expiration timestamp on the table, only the MERGE would need to be performed outside Spark.
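To make the pattern concrete, here is a minimal sketch of the staging-table upsert described above. The table names, key, and column list are illustrative, not from any real project; only the MERGE statement shape matches BigQuery's DML syntax.

```python
def build_merge_sql(target: str, staging: str, key: str, columns: list[str]) -> str:
    """Build a BigQuery MERGE statement that upserts rows from a
    staging (temporary) table into a target table, matching on `key`."""
    set_clause = ", ".join(f"T.{c} = S.{c}" for c in columns)
    cols = ", ".join(columns)
    vals = ", ".join(f"S.{c}" for c in columns)
    return (
        f"MERGE `{target}` T "
        f"USING `{staging}` S "
        f"ON T.{key} = S.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({key}, {cols}) "
        f"VALUES (S.{key}, {vals})"
    )

sql = build_merge_sql(
    "my-project.my_dataset.target",   # hypothetical target table
    "my-project.my_dataset.staging",  # hypothetical temp table written by Spark
    key="id",
    columns=["name", "updated_at"],
)
```

With an expiration timestamp on the staging table, BigQuery would drop it automatically, so the cleanup step after the MERGE would disappear entirely.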
This would also be helpful in general for creating temporary tables via the Spark DataFrame APIs.
Hi,
I am using the BigQuery connector to load data into a BigQuery table, and I am wondering whether there is an option to set a table expiration when using the `dataframe.write()` operation. I see that the connector supports other types of expiration, but not this one. Thank you!
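Until the connector supports this, one workaround is to set the expiration after the write using the `google-cloud-bigquery` client, whose `Table.expires` field and `update_table` call do support table-level expiration. A minimal sketch (the table path is a placeholder, and the client calls are shown but not executed here since they need credentials):

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

def expiration_timestamp(hours: int, now: Optional[datetime] = None) -> datetime:
    """Compute an absolute UTC expiration timestamp `hours` from now."""
    now = now or datetime.now(timezone.utc)
    return now + timedelta(hours=hours)

# Applying it after dataframe.write() has finished (requires credentials
# and a real table, so not run in this sketch):
#
#   from google.cloud import bigquery
#   client = bigquery.Client()
#   table = client.get_table("my-project.my_dataset.my_temp_table")
#   table.expires = expiration_timestamp(24)  # auto-delete in 24 hours
#   client.update_table(table, ["expires"])
```

This keeps the Spark job itself unchanged; only a small post-write step is added outside the connector.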