Open ArnoldHueteG opened 5 months ago
Thank you @ArnoldHueteG for this feature request, it would indeed allow for more flexible data loading. Could you clarify the use case here - is it for an existing table, where the dataframe to load may lack certain columns even though its schema is provided? More specifically, should the extra column already be present in the table, and if not, should it be added?
The enhancement I am proposing is for the load_table_from_dataframe function to proceed with the load even when certain schema columns are missing from the DataFrame, automatically assigning null values to those missing, nullable fields.
is it for an existing table, and the dataframe to load may lack certain columns, although its schema is provided?
We do have this exact use case. It would indeed be great to have that flexibility.
Sorry for the late reply. The reason we enforced that the dataframe must contain every column in the schema was to make it easier to catch typos in the schema. So essentially there are two conflicting corner cases we want to handle. Maybe we can make the error a warning instead? WDYT @tswast?
A warning for missing fields sounds like a good solution to me.
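The relaxation discussed above, downgrading the hard error to a warning, could be sketched roughly like this. This is a hypothetical illustration, not the library's actual internals; the function name and arguments are invented for the example:

```python
import warnings

# Hypothetical sketch: warn instead of raising when schema fields are
# missing from the DataFrame. In the real library this check happens
# inside load_table_from_dataframe's schema handling.
def check_schema_columns(bq_schema_names, dataframe_columns):
    missing = set(bq_schema_names) - set(dataframe_columns)
    if missing:
        warnings.warn(
            f"bq_schema contains fields not present in dataframe: {missing}"
        )

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    check_schema_columns({"name", "field_not_present"}, ["name"])

print(len(caught))                                    # 1
print("field_not_present" in str(caught[0].message))  # True
```

This keeps the typo-catching benefit the maintainers mention (the mismatch is still surfaced) while letting the load proceed.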
This is much needed functionality. Please allow the load job to pass with a warning if the DataFrame doesn't have all the columns from the schema. This is especially needed when we do WRITE_APPEND.
A warning for missing fields sounds like a good solution to me.
Any updates on when this can be expected?
Description:
Environment details
OS: macOS Sonoma 14.1.1
Python version: 3.10
google-cloud-bigquery version: 3.17.1
Steps to reproduce
1. Create a BigQuery schema with additional fields not present in the DataFrame.
2. Use load_table_from_dataframe with the defined schema to load data into BigQuery.
Current behavior
Currently, when using load_table_from_dataframe from the Python BigQuery client, if the provided schema contains fields that are not present in the DataFrame, a ValueError is raised: ValueError: bq_schema contains fields not present in dataframe: {'field_not_present'}.
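The check behind this error can be mirrored locally. The following is a simplified sketch (the schema names and DataFrame are hypothetical; the real check runs inside load_table_from_dataframe):

```python
import pandas as pd

# Schema declares a field the DataFrame lacks.
bq_schema_names = {"name", "field_not_present"}
df = pd.DataFrame({"name": ["a", "b"]})

# The mismatch that triggers the ValueError described above.
missing = bq_schema_names - set(df.columns)
try:
    if missing:
        raise ValueError(
            f"bq_schema contains fields not present in dataframe: {missing}"
        )
except ValueError as exc:
    err_msg = str(exc)
    print(err_msg)
```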
Expected behavior
In contrast to the command-line tool's behavior when loading JSON data into a BigQuery table, the Python client currently requires a strict match between the DataFrame columns and the provided schema. This can be limiting, since the command-line tool does not enforce such a match.
I propose that load_table_from_dataframe be enhanced to allow more flexible schema matching, similar to the command-line tool's behavior. Specifically, it should not raise an error if the schema contains additional fields not present in the DataFrame. This would allow for more versatile data loading scenarios where the DataFrame might not always have the complete set of fields defined in the BigQuery table schema.
Use case This feature would be particularly useful in scenarios where the DataFrame is dynamically generated and might not always contain the full set of fields as per the BigQuery schema. Allowing the function to ignore extra schema fields would enable more flexible and robust data loading operations.