googleapis / python-bigquery

Apache License 2.0

Allow load_table_from_dataframe to Ignore Extra Schema Fields #1812

Open ArnoldHueteG opened 5 months ago

ArnoldHueteG commented 5 months ago

Description:

Environment details

- OS: macOS Sonoma 14.1.1
- Python version: 3.10
- google-cloud-bigquery version: 3.17.1

Steps to reproduce

1. Create a BigQuery schema with additional fields not present in the DataFrame.
2. Use load_table_from_dataframe with the defined schema to load data into BigQuery.

from google.cloud import bigquery
from google.oauth2 import service_account
import pandas as pd

credentials = service_account.Credentials.from_service_account_file(
    'credentials.json')
client = bigquery.Client(credentials=credentials, project=credentials.project_id)

json_data = [
    {
        "name": "Alice",
        "age": 30
    },
    {
        "name": "Bob",
        "age": 25
    }
]

df = pd.DataFrame(json_data)

job_config = bigquery.LoadJobConfig(
    schema=[
        bigquery.SchemaField("name", "STRING"),
        bigquery.SchemaField("age", "INTEGER"),
        bigquery.SchemaField("field_not_present", "INTEGER"),
    ],
    job_timeout_ms=5,
)
load_job = client.load_table_from_dataframe(
    df, "xenon-world-399922.oss.your_table",
    job_config=job_config
)
load_job.result()

Current behavior

Currently, when using load_table_from_dataframe from the Python BigQuery client, if the provided schema contains fields that are not present in the DataFrame, a ValueError is raised:

ValueError: bq_schema contains fields not present in dataframe: {'field_not_present'}

Expected behavior

The Python client currently requires a strict match between the DataFrame columns and the provided schema. This is more restrictive than the command line tool, which does not enforce such a match when loading JSON data into a BigQuery table.

I propose that load_table_from_dataframe be enhanced to allow a more flexible schema matching, similar to the command line tool's behavior. Specifically, it should not raise an error if the schema contains additional fields not present in the DataFrame. This would allow for more versatile data loading scenarios where the DataFrame might not always have the complete set of fields defined in the BigQuery table schema.

Use case

This feature would be particularly useful in scenarios where the DataFrame is dynamically generated and might not always contain the full set of fields as per the BigQuery schema. Allowing the function to ignore extra schema fields would enable more flexible and robust data loading operations.
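Until the library changes, one workaround is to filter the schema client-side so only fields that actually appear in the DataFrame are passed to the job config. The sketch below is illustrative: `SchemaField` here is a minimal stand-in exposing the same `name` attribute as `bigquery.SchemaField`, so the same filter works unchanged on the real objects; `filter_schema` is a hypothetical helper name, not a library API.

```python
from collections import namedtuple

# Minimal stand-in for bigquery.SchemaField -- the real objects expose the
# same `name` attribute, so filter_schema works on them unchanged.
SchemaField = namedtuple("SchemaField", ["name", "field_type"])

def filter_schema(bq_schema, df_columns):
    """Keep only schema fields whose name appears in the DataFrame columns."""
    present = set(df_columns)
    return [field for field in bq_schema if field.name in present]

schema = [
    SchemaField("name", "STRING"),
    SchemaField("age", "INTEGER"),
    SchemaField("field_not_present", "INTEGER"),
]
# Filtering against the DataFrame's columns drops "field_not_present",
# so LoadJobConfig(schema=filtered) no longer triggers the ValueError.
filtered = filter_schema(schema, ["name", "age"])
```

Note that this sidesteps the error rather than implementing the requested behavior: the dropped field is simply absent from the load job's schema, so BigQuery falls back to the destination table's definition for it.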

Linchin commented 4 months ago

Thank you @ArnoldHueteG for this feature request, it would indeed allow for more flexible data loading. Could you clarify the use case here: is it for an existing table, where the DataFrame to load may lack certain columns even though its schema is provided? More specifically, should the extra column already be present in the table, and if not, should it be added?

ArnoldHueteG commented 4 months ago

The enhancement I am proposing is for the load_table_from_dataframe function to proceed with data loading even when certain schema columns are missing from the DataFrame, automatically assigning null values to those missing, nullable fields.
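This proposed behavior can be approximated client-side today by null-filling the missing columns before loading. The sketch below uses plain dicts instead of a pandas DataFrame to stay self-contained; with pandas the equivalent is `df[name] = None` for each missing nullable column. The helper name `add_missing_columns` is hypothetical, not part of the library.

```python
def add_missing_columns(rows, schema_names):
    """For every schema field absent from a record, fill it with None.

    `rows` is a list of dicts (stand-in for DataFrame records) and
    `schema_names` is the ordered list of field names from the BQ schema.
    """
    return [{name: row.get(name) for name in schema_names} for row in rows]

rows = [{"name": "Alice", "age": 30}, {"name": "Bob", "age": 25}]
# "field_not_present" is in the schema but not in the data, so it is
# filled with None (loaded as NULL into the nullable column).
completed = add_missing_columns(rows, ["name", "age", "field_not_present"])
```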

aaaaahaaaaa commented 4 months ago

> is it for an existing table, and the dataframe to load may lack certain columns, although its schema is provided?

We do have this exact use case. It would indeed be great to have that flexibility.

Linchin commented 4 months ago

Sorry for the late reply. The reason we enforced that the DataFrame must contain every column in the schema was to make it easier to catch typos in the schema. So essentially there are two conflicting corner cases we want to handle. Maybe we can make the error a warning instead? WDYT @tswast?

tswast commented 3 months ago

A warning for missing fields sounds like a good solution to me.
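A minimal sketch of what a warn-instead-of-raise check might look like, assuming the comparison happens between schema field names and DataFrame column names. The function name `check_missing_fields` and the warning wording are illustrative, not the library's actual API.

```python
import warnings

def check_missing_fields(bq_field_names, df_columns):
    """Warn (rather than raise ValueError) when the schema has fields
    that the DataFrame lacks; return the set of missing names."""
    missing = set(bq_field_names) - set(df_columns)
    if missing:
        warnings.warn(
            f"bq_schema contains fields not present in dataframe: {missing}",
            UserWarning,
        )
    return missing
```

This preserves the typo-catching signal Linchin mentioned (the mismatch is still surfaced) while letting the load job proceed for the WRITE_APPEND-style use cases described above.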

kabilan-dt commented 1 month ago

This is much needed functionality. Please allow the load job to pass with a warning if the DataFrame doesn't have all the columns from the schema. This is especially needed when we use WRITE_APPEND.

leonpawelzik commented 3 weeks ago

> A warning for missing fields sounds like a good solution to me.

Any updates on when this can be expected?