Open wylbee opened 1 year ago
Hey @brown5628 - thanks for opening this! For the sake of auditing, would you mind installing your fork in your Snowflake-based project and running a query that uses the `dataset` macro, to validate that the query compiles and runs successfully? Then paste a screenshot of the dbt CLI logs or the compiled query here for reference. Sorry for the tedious ask, but since the project's CI pipeline only runs on DuckDB and this PR is a Snowflake-specific fix, it'd be nice to have evidence of the code change working as expected.
@bcodell Should be ready for review. Thanks for your patience on this one with the slow turnaround time. Quick callouts:

- Changed `json_extract({{ dbt_activity_schema.primary() }}.feature_json, 'type') = json_extract({{ dbt_activity_schema.appended() }}.feature_json, 'type')` to `parse_json({{ dbt_activity_schema.primary() }}.feature_json):"type" = parse_json({{ dbt_activity_schema.appended() }}.feature_json):"type"`

Two questions from me:

1. My local environment runs `sqlfmt`, so some of the diff is formatting churn. Happy to turn that off and restate a cleaner diff if that is preferred, just let me know.
2. The `model` configs in the `dbt_project.yml` - let me know if you'd like those handled differently.
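For illustration, the changed condition might compile to something like the following on Snowflake. Note this is a sketch: `primary_stream` and `appended_stream` are hypothetical stand-ins for whatever aliases the `primary()`/`appended()` macros actually render.

```sql
-- Hypothetical compiled join condition; aliases are illustrative only.

-- Before (generic implementation, not valid on Snowflake):
--   json_extract(primary_stream.feature_json, 'type')
--     = json_extract(appended_stream.feature_json, 'type')

-- After (Snowflake's parse_json + colon path syntax):
parse_json(primary_stream.feature_json):"type"
    = parse_json(appended_stream.feature_json):"type"
```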
Per #26, the generic implementation of the `_min_or_max` macro does not work correctly on Snowflake, with the root cause being dbt's `safe_cast` using a function in Snowflake with limitations that make it unsuitable for this purpose. Given that this macro is intended to create the equivalent of Snowflake's `min_by`/`max_by` functions, the work of this PR is to implement a Snowflake-specific version of the macro that uses those functions directly.

Test 1 - Integration suite
Test 2 - Prod project

- `dbt_project.yml` vars
- query
- compiled query
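For context, here is a sketch of the `min_by`/`max_by` semantics the macro is meant to reproduce. The table and column names below are hypothetical, not taken from the project:

```sql
-- min_by(a, b) returns the value of column a from the row where column b
-- is smallest; max_by(a, b) is the mirror image for the largest b.
-- activity_stream, customer_id, activity_id, and activity_ts are made up.
select
    customer_id,
    min_by(activity_id, activity_ts) as first_activity_id,
    max_by(activity_id, activity_ts) as last_activity_id
from activity_stream
group by customer_id
```

The generic `_min_or_max` macro has to emulate this on warehouses that lack `min_by`/`max_by`, which is where the `safe_cast`-based workaround breaks down on Snowflake.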