Open tswast opened 2 years ago
Currently, this failure happens locally because the generated schema != the server-side schema. Even without that client-side check, I think this failure could still happen with api_method="load_csv", as it also uses the generated schema if one is not provided.
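For context, a minimal sketch of the kind of call that can hit this; the DataFrame contents and destination table below are illustrative, not taken from the original report:

    import pandas
    import pandas_gbq

    # A DataFrame whose datetime column carries no explicit BigQuery type.
    df = pandas.DataFrame(
        {
            "row_num": [1, 2, 3],
            "some_datetime": pandas.to_datetime(
                ["2021-01-01 12:00", "2021-01-02 12:00", "2021-01-03 12:00"]
            ),
        }
    )

    # Without table_schema, pandas-gbq generates a schema from the dtypes and
    # may map some_datetime to TIMESTAMP; appending to an existing table whose
    # column is DATETIME can then fail the schema comparison described above.
    pandas_gbq.to_gbq(df, "my_dataset.my_table", if_exists="append")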
Actually, there is a workaround. The user can manually specify a schema if they want DATETIME instead of TIMESTAMP.
...
pandas_gbq.to_gbq(
    df,
    destination,
    if_exists="append",
    table_schema=[
        {"name": "row_num", "type": "INTEGER"},
        {"name": "some_datetime", "type": "DATETIME"},
    ],
)
I might change this to a Feature Request, as I believe this is a known way of dealing with the ambiguity between TIMESTAMP and DATETIME.
+1 for DATETIME support instead of just TIMESTAMP. Currently all my datetime columns are being uploaded as TIMESTAMP by the to_gbq function unless I specify that the column is of type 'DATETIME' via the table_schema argument.
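For anyone doing this across many columns, a small sketch of one way to build the table_schema automatically; the helper name and destination are illustrative, and this assumes pandas-gbq's documented behavior of inferring any columns not listed in table_schema:

    import pandas_gbq

    def datetime_table_schema(df):
        # Mark every datetime64[ns] column as DATETIME so pandas-gbq does not
        # fall back to its default TIMESTAMP mapping for those columns.
        return [
            {"name": column, "type": "DATETIME"}
            for column in df.select_dtypes(include=["datetime64[ns]"]).columns
        ]

    pandas_gbq.to_gbq(
        df,
        "my_dataset.my_table",  # illustrative destination
        if_exists="append",
        table_schema=datetime_table_schema(df),
    )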