edudosaara opened this issue 5 months ago (status: Open)
Thanks for reporting this @edudosaara!
After getting insights from @MichelleArk, we believe we should document this as a known limitation, and I've opened https://github.com/dbt-labs/docs.getdbt.com/issues/5307 as a result.
All fields in a BigQuery struct need to be specified in a unit test — it's not currently possible to supply only a subset of the struct's columns.
That said, we agree that we'd like to support this. Doing so would probably require writing some struct-specific handling of missing fields in the `safe_cast` method.
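A minimal sketch of what such struct-aware handling could look like. This is illustrative Python only, not dbt-bigquery's actual `safe_cast` implementation; the function name and the schema representation are assumptions made for the example:

```python
# Illustrative sketch only -- NOT dbt's actual code. It shows one way
# struct-aware fixture generation could work: walk the declared struct
# schema and emit safe_cast(null as <type>) for any field the unit-test
# fixture does not supply, recursing into nested structs.

def fill_struct_fixture(schema: dict, fixture: dict) -> str:
    """Render a BigQuery struct literal, null-filling missing fields.

    schema:  {field_name: type_string, or a nested schema dict for structs}
    fixture: the subset of fields the unit test actually provides.
    """
    parts = []
    for field, ftype in schema.items():
        if isinstance(ftype, dict):  # nested struct: recurse with its sub-fixture
            sub = fixture.get(field, {})
            parts.append(f"{fill_struct_fixture(ftype, sub)} as {field}")
        elif field in fixture:       # field supplied by the test: cast its value
            parts.append(f"safe_cast({fixture[field]!r} as {ftype}) as {field}")
        else:                        # missing field: safe-cast a null, don't drop it
            parts.append(f"safe_cast(null as {ftype}) as {field}")
    return "struct(" + ", ".join(parts) + ")"
```

With the `content` struct from this issue (`name`, `description`, `date`), passing only `{"name": "John"}` would render the two missing fields as `safe_cast(null as string)` and `safe_cast(null as timestamp)` rather than omitting them.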
In the meantime, the workaround is to include all the fields of the BigQuery struct in the unit test fixture.
Is this a new bug in dbt-bigquery?
Current Behavior
Unit testing models works fine at the row level. But when dbt builds a fixture for a nested struct, it does not safe-cast the missing properties: even the nullable properties of a struct have to be specified manually in the test for it to run without errors.
Expected Behavior
When compiling unit tests, dbt's fixture generation should recursively create safe-cast nulls for every missing property of nested structs.
Steps To Reproduce
```sql
SELECT * FROM {{ source('gbq_dataset', 'example_table') }}
```
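A unit test along these lines triggers the error. This YAML is a reconstruction for illustration — the test and model names are assumed, not taken from the original report. The key point is that `content` supplies only `name`, a subset of the struct's fields:

```yaml
# Hypothetical unit test definition (names assumed for illustration).
# Only `name` is given for the `content` struct, so dbt would need to
# null-fill `description` and `date` -- which it currently does not.
unit_tests:
  - name: test_example_table_struct
    model: my_model
    given:
      - input: source('gbq_dataset', 'example_table')
        rows:
          - id: 1
            element_name: test
            content:
              - name: John
    expect:
      rows:
        - id: 1
```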
The resulting compiled SQL will be like this:
```sql
with dbtcte__example_table as (
    -- Fixture for example_table
    select
        safe_cast(1 as NUMERIC) as id,
        safe_cast('test' as STRING) as element_name,
        (select array_agg(safe_cast(i as struct<name string, description string, date timestamp>))
         from unnest([ struct("John" as name) ]) i) as content
)
SELECT * FROM dbtcte__example_table
) as __dbt_sbq where false and current_timestamp() = current_timestamp() limit 0
```
Additional Context
No response