fivetran / dbt_fivetran_log

Data models for Fivetran's internal log connector built using dbt.
https://fivetran.github.io/dbt_fivetran_log/
Apache License 2.0

[Bug] `fivetran_log_json_parse.string_path` arg should be an array in `fivetran_platform__audit_user_activity` model #124

Closed fivetran-jamie closed 3 months ago

fivetran-jamie commented 4 months ago

Is there an existing issue for this?

Describe the issue

On line 5 we have {{ fivetran_log.fivetran_log_json_parse(string='message_data', string_path='actor') }} as actor_email

But it should be {{ fivetran_log.fivetran_log_json_parse(string='message_data', string_path=['actor']) }} as actor_email

This is how the macro is used everywhere else.

This causes the JSON parsing to output nothing, which in turn limits the output of the fivetran_platform__audit_user_activity model.
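
For illustration only (a simplified sketch, not the actual macro source), the behavior comes down to how Jinja iterates the string_path argument: a plain string is iterated character by character, while a list is iterated element by element, which is why 'actor' turns into one path segment per letter in the compiled SQL below.

  {# Hypothetical sketch of the macro's path-building loop (Snowflake-style output). #}
  {% macro sketch_json_parse(string, string_path) %}
      try_parse_json({{ string }}){%- for field in string_path -%}['{{ field }}']{%- endfor %}
  {% endmacro %}

  {# string_path='actor'   -> iterates characters: ['a']['c']['t']['o']['r'] #}
  {# string_path=['actor'] -> iterates the list:   ['actor'] #}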

Relevant error log or model output

Compiled fivetran_platform__audit_user_activity code for Snowflake:
try_parse_json(message_data)['a']['c']['t']['o']['r'] as actor_email

For Postgres:
  case when message_data ~ '^\s*[\{].*[\}]?\s*$' -- Postgres has no native json check, so this will check the string for indicators of a JSON object
    then message_data::json #>> '{a,c,t,o,r}'
    else null end

For BQ:
json_extract_scalar(message_data, '$.a.c.t.o.r')

etc.

Expected behavior

I would expect the macro to compile to:

For Snowflake:
try_parse_json(message_data)['actor'] as actor_email

For Postgres:
  case when message_data ~ '^\s*[\{].*[\}]?\s*$' -- Postgres has no native json check, so this will check the string for indicators of a JSON object
    then message_data::json #>> '{actor}'
    else null end

For BQ:
json_extract_scalar(message_data, '$.actor')
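
Assuming nothing else in the model needs to change, the fix is a one-line edit to the macro call shown above (sketch based on the snippets in this issue):

  -- before: string_path passed as a string
  -- {{ fivetran_log.fivetran_log_json_parse(string='message_data', string_path='actor') }} as actor_email

  -- after: string_path passed as a single-element list, matching how the macro is called elsewhere
  {{ fivetran_log.fivetran_log_json_parse(string='message_data', string_path=['actor']) }} as actor_email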

dbt Project configurations

na

Package versions

na

What database are you using dbt with?

snowflake

dbt Version

na

Additional Context

https://fivetran.slack.com/archives/C02919TN9AT/p1714746313138259

Are you willing to open a PR to help address this issue?